Multifocus Images Fusion Based On Homogenity and Edges Measures

Image fusion is one of the most important techniques in digital image processing: it integrates multiple sets of data acquired for the same location into a single image. It is one of the newer approaches adopted for solving digital-image problems, producing high-quality images that contain more information for purposes of interpretation, classification, segmentation, compression, and so on. In this research, the multi-focus problem is addressed by capturing differently focused digital images with a camera and fusing them using previously adopted fusion techniques: arithmetic techniques (BT, CNT and MLT), statistical techniques (LMM, RVS and WT), and spatial techniques (HPFA, HFA and HFM). These techniques were implemented as programs in MATLAB (R2010b). In addition, a homogeneity criterion is proposed for evaluating the quality of the fused digital image, especially its fine details. The criterion is a correlation measure that estimates homogeneity in different regions of the image: blocks of different sizes are taken from different regions, shifted pixel by pixel, and the correlation between each block and its shifted version is computed. Traditional statistical criteria (mean, standard deviation, signal-to-noise ratio, mutual information, and spatial frequency) were also applied and compared with the proposed criterion. The results show that the evaluation process is effective because it takes the quality of homogeneous regions into account.
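
The block-shift correlation idea can be illustrated with a short sketch. This is a minimal Python/NumPy example, not the authors' MATLAB implementation: the block size, shift step, and the way per-block correlations are averaged are assumptions made for illustration.

```python
import numpy as np

def block_homogeneity(image, block=16, shift=1):
    """Estimate regional homogeneity of a grayscale fused image.

    Each block is compared with a copy of itself shifted by `shift`
    pixels, and the Pearson correlation of the two patches is taken as a
    local homogeneity score.  The mean over all blocks is returned.
    Block size and shift step are illustrative choices.
    """
    img = np.asarray(image, dtype=float)
    rows, cols = img.shape
    scores = []
    for r in range(0, rows - block - shift, block):
        for c in range(0, cols - block - shift, block):
            patch = img[r:r + block, c:c + block]
            shifted = img[r + shift:r + shift + block,
                          c + shift:c + shift + block]
            p, s = patch.ravel(), shifted.ravel()
            if p.std() > 0 and s.std() > 0:
                # Pearson correlation between the patch and its shifted copy
                scores.append(np.corrcoef(p, s)[0, 1])
    return float(np.mean(scores)) if scores else 0.0

# Example: a smooth (highly homogeneous) region scores close to 1.0
smooth = np.tile(np.linspace(0, 255, 64), (64, 1))
print(block_homogeneity(smooth))
```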

Publication Date: Fri Jan 20 2023
Journal Name: Ibn Al-Haitham Journal for Pure and Applied Sciences
Studying the Classification of Texture Images by K-Means of Co-Occurrence Matrix and Confusion Matrix

In this research, a group of gray texture images from the Brodatz database was studied by building a feature database for the images using the gray-level co-occurrence matrix (GLCM), with a pixel distance of one unit and four angles (0°, 45°, 90°, 135°). The k-means classifier was used to classify the images into groups of classes, from two up to eight classes, for every angle used in the co-occurrence matrix. The distribution of the images over the classes was compared pairwise between methods (by projecting one class onto another); the distribution of images was uneven, with one class being dominant. The classification results were studied for all cases using the confusion matrix between every…

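A minimal sketch of the GLCM-plus-k-means pipeline, assuming scikit-image and scikit-learn; the Haralick property set, the pooling of the four angles, and the training setup are illustrative choices rather than the paper's exact configuration.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.cluster import KMeans
from sklearn.metrics import confusion_matrix

def glcm_features(image, levels=256):
    """GLCM features at distance 1 and four angles (0, 45, 90, 135 deg)."""
    glcm = graycomatrix(image, distances=[1],
                        angles=[0, np.pi/4, np.pi/2, 3*np.pi/4],
                        levels=levels, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

def cluster_textures(images, labels, n_classes=4):
    """Cluster texture images by GLCM features and report the confusion
    matrix against their known classes (cluster ids are arbitrary)."""
    X = np.array([glcm_features(im) for im in images])
    pred = KMeans(n_clusters=n_classes, n_init=10, random_state=0).fit_predict(X)
    return confusion_matrix(labels, pred)
```
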
Publication Date: Wed Aug 30 2023
Journal Name: Baghdad Science Journal
Comparative Analysis of MFO, GWO and GSO for Classification of Covid-19 Chest X-Ray Images

Medical images play a crucial role in the classification of various diseases and conditions. One such imaging modality is X-ray, which provides valuable visual information that helps in the identification and characterization of various medical conditions. Chest radiograph (CXR) images have long been used to examine and monitor numerous lung disorders, such as tuberculosis, pneumonia, atelectasis, and hernia. COVID-19 detection can be accomplished using CXR images as well. COVID-19, a disease caused by a virus that infects the lungs and the airways of the upper respiratory tract, was first identified in 2019 in Wuhan, China, and has since been thought to cause substantial airway damage, badly impacting the lungs of affected persons.

Publication Date: Mon Jun 01 2020
Journal Name: Journal of Planner and Development
Mapping Paddy Rice Fields Using Landsat and Sentinel Radar Images in Urban Areas for Agriculture Planning

This research develops a new method based on spectral indices and a random forest classifier to detect paddy rice areas and then assess their distribution relative to urban areas. The classification is conducted on Landsat OLI images and on combined Landsat OLI/Sentinel-1 SAR data. A new spectral index is developed by analyzing the relative importance of the Landsat bands as computed by the random forest: the index is built from the three most important bands, and two additional indices, the normalized difference vegetation index (NDVI) and the normalized difference built-up index (NDBI), are then used to extract the paddy rice fields from the data. Several experiments…

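A minimal sketch of the index-plus-random-forest idea, assuming per-pixel Landsat OLI band arrays and scikit-learn; the band handling, index definitions in generic form, and the training setup are illustrative, not the paper's exact workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def ndvi(nir, red):
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red + 1e-9)

def ndbi(swir, nir):
    """Normalized difference built-up index."""
    return (swir - nir) / (swir + nir + 1e-9)

def rank_bands(bands, labels, train_idx):
    """Rank Landsat bands by random forest feature importance.

    `bands` is a dict of equally shaped 2-D reflectance arrays;
    `labels` holds the class (paddy rice / other) of each training pixel
    indexed by `train_idx`.  The three highest-ranked bands would feed
    the new spectral index proposed in the paper.
    """
    names = list(bands)
    X = np.stack([bands[n].ravel()[train_idx] for n in names], axis=1)
    rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
    order = np.argsort(rf.feature_importances_)[::-1]
    return [(names[i], float(rf.feature_importances_[i])) for i in order]
```
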
Publication Date: Thu Oct 31 2024
Journal Name: Intelligent Automation and Soft Computing
Fusion of Type-2 Neutrosophic Similarity Measure in Signatures Verification Systems: A New Forensic Document Analysis Paradigm

Signature verification involves vague situations in which a signature may resemble many reference samples, or may differ from its own references because of handwriting variance. By representing the features and the similarity score produced by the matching algorithm as neutrosophic sets, capturing degrees of membership, non-membership, and indeterminacy, a neutrosophic engine can contribute significantly to signature verification, since it addresses the uncertainty and ambiguity inherent in signatures. However, type-1 neutrosophic logic assigns these membership functions fixed values, which cannot adequately capture the varying degrees of uncertainty in signature characteristics. Type-1 neutrosophic representation is also unable to adjust to various…

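To make the type-1 baseline concrete, here is a minimal sketch of a commonly used single-valued neutrosophic similarity measure. It illustrates the type-1 representation the abstract contrasts against, not the paper's type-2 measure, and the example triples are hypothetical values.

```python
import numpy as np

def neutrosophic_similarity(a, b):
    """Similarity between two single-valued neutrosophic sets.

    Each set is an (n, 3) array of (truth, indeterminacy, falsity)
    triples, one per feature.  The score is 1 minus the mean absolute
    difference of the three components, so identical sets score 1.0.
    """
    a, b = np.asarray(a, float), np.asarray(b, float)
    return 1.0 - np.abs(a - b).mean()

# Hypothetical triples for two signatures described by three features
questioned = [(0.80, 0.10, 0.15), (0.60, 0.30, 0.20), (0.70, 0.20, 0.10)]
reference  = [(0.75, 0.15, 0.20), (0.55, 0.25, 0.30), (0.72, 0.18, 0.12)]
print(neutrosophic_similarity(questioned, reference))  # close to 1 -> likely genuine
```
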
Publication Date: Sun Dec 01 2013
Journal Name: Diyala Journal of Engineering Sciences
Design and Simulation of parallel CDMA System Based on 3D-Hadamard Transform

Future wireless systems aim to provide higher transmission data rates, improved spectral efficiency, and greater capacity. In this paper, a spectrally efficient two-dimensional (2-D) parallel code division multiple access (CDMA) system is proposed for generating and transmitting 2-D CDMA symbols through a 2-D inter-symbol interference (ISI) channel in order to increase the transmission speed. A 3-D Hadamard matrix is used to generate the 2-D spreading codes required to spread each user's two-dimensional data row-wise and column-wise. Quadrature amplitude modulation (QAM) is used as the data mapping technique because of the increased spectral efficiency it offers. The new structure is simulated in MATLAB and its performance is compared for…

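A minimal sketch of row-and-column Hadamard spreading of a 2-D symbol block, assuming SciPy; the block size and the use of a single user's row/column codes drawn from an ordinary Hadamard matrix are illustrative simplifications of the paper's 3D-Hadamard construction.

```python
import numpy as np
from scipy.linalg import hadamard

N = 8                     # spreading length (power of two)
H = hadamard(N)           # N x N Hadamard matrix, entries +/-1

def spread(data, row_code, col_code):
    """Spread a 2-D block of QAM symbols row-wise and column-wise.

    Each symbol d[i, j] is expanded to d[i, j] * outer(row_code, col_code),
    giving an (N*rows) x (N*cols) chip matrix.
    """
    return np.kron(data, np.outer(row_code, col_code))

def despread(chips, row_code, col_code, rows, cols):
    """Correlate with the same codes to recover the symbol block."""
    out = np.zeros((rows, cols), dtype=complex)
    for i in range(rows):
        for j in range(cols):
            block = chips[i*N:(i+1)*N, j*N:(j+1)*N]
            out[i, j] = row_code @ block @ col_code / N**2
    return out

# 2x2 block of 4-QAM symbols for one user, spread with codes H[1] and H[2]
data = np.array([[1+1j, -1+1j], [1-1j, -1-1j]])
chips = spread(data, H[1], H[2])
print(np.allclose(despread(chips, H[1], H[2], 2, 2), data))  # True
```
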
Publication Date: Thu Mar 01 2018
Journal Name: Journal of Engineering
Implementation of Power System Stabilizer Based on Conventional and Fuzzy Logic Controllers

To damp the low-frequency oscillations that arise from disturbances in an electrical power system, generators are equipped with a Power System Stabilizer (PSS) that provides a supplementary feedback stabilizing signal. Low-frequency oscillations in a power system are classified as local-mode, intra-area-mode, and inter-area-mode oscillations. A suitable PSS model was selected for the low-frequency inter-area mode, based on both a conventional PSS and a fuzzy logic controller. Two types of fuzzy inference system (FIS), Mamdani and Sugeno, were considered in this paper. The methods were implemented using the MATLAB R2015a package.
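
For the conventional PSS, the standard structure is a gain, a washout filter, and lead-lag compensation. Below is a minimal SciPy sketch of that transfer function acting on a speed-deviation signal; the gain and time constants are placeholder values, not the ones tuned in the paper.

```python
import numpy as np
from scipy.signal import TransferFunction, lsim

# Conventional PSS: gain * washout * single lead-lag stage
#   G(s) = K * (s*Tw / (1 + s*Tw)) * ((1 + s*T1) / (1 + s*T2))
K, Tw, T1, T2 = 20.0, 10.0, 0.5, 0.05   # placeholder tuning values

num = K * np.polymul([Tw, 0.0], [T1, 1.0])   # K * Tw*s * (T1*s + 1)
den = np.polymul([Tw, 1.0], [T2, 1.0])       # (Tw*s + 1) * (T2*s + 1)
pss = TransferFunction(num, den)

# Feed a decaying 1 Hz speed-deviation oscillation through the stabilizer
t = np.linspace(0.0, 5.0, 2001)
dw = 0.01 * np.exp(-0.3 * t) * np.sin(2 * np.pi * 1.0 * t)
_, vstab, _ = lsim(pss, U=dw, T=t)
print(vstab[:5])   # supplementary stabilizing signal samples
```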

Publication Date: Fri Mar 15 2024
Journal Name: Iraqi Statisticians Journal
Estimate a nonparametric copula density function based on probit and wavelet transforms

This study employs wavelet transforms to address the issue of boundary effects and uses probit transform techniques, based on the probit function, to estimate the copula density function. The estimation depends on the empirical distribution functions of the variables, and the density is estimated in the transformed domain. Recent research indicates that early implementations of this strategy may have been more efficient. Nevertheless, in this work two novel methodologies based on the probit transform and the wavelet transform are implemented, and they are evaluated and compared using three criteria: root mean square error (RMSE), Akaike information criterion (AIC), and log…

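A minimal sketch of the probit-transform idea, assuming SciPy; a Gaussian kernel density estimator stands in for the wavelet estimator used in the paper, so this only illustrates the change of domain and the back-transformation with the normal-density Jacobian.

```python
import numpy as np
from scipy.stats import norm, rankdata, gaussian_kde

def copula_density_probit(x, y):
    """Estimate the copula density of (x, y) via the probit transform.

    Pseudo-observations are mapped to the real line with the probit
    (inverse normal CDF), the joint density is estimated there, and the
    result is mapped back with the normal-density Jacobian.  A Gaussian
    KDE stands in for the wavelet density estimator of the paper.
    """
    n = len(x)
    u = rankdata(x) / (n + 1)        # pseudo-observations in (0, 1)
    v = rankdata(y) / (n + 1)
    z = np.vstack([norm.ppf(u), norm.ppf(v)])
    g = gaussian_kde(z)              # density on the probit scale

    def c(uu, vv):
        zu, zv = norm.ppf(uu), norm.ppf(vv)
        return g(np.vstack([zu, zv])) / (norm.pdf(zu) * norm.pdf(zv))
    return c

# Example with a correlated Gaussian sample: density > 1 on the diagonal
rng = np.random.default_rng(0)
x, y = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], 500).T
c = copula_density_probit(x, y)
print(c(np.array([0.5]), np.array([0.5])))
```
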
Publication Date: Mon Apr 15 2024
Journal Name: Journal of Engineering Science and Technology
Text Steganography Based on Arabic Characters Linguistic Features and Word Shifting Method

In the field of data security, the critical challenge of preserving sensitive information during its transmission through public channels takes centre stage. Steganography, a method of concealing data within various carrier objects such as text, can be proposed to address these security challenges. Text, owing to its extensive usage and constrained bandwidth, stands out as an optimal medium for this purpose. Despite the richness of the Arabic language in linguistic features, only a small number of studies have explored Arabic text steganography. Arabic text, characterized by its distinctive script and linguistic features, has gained notable attention as a promising domain for steganographic work. Arabic text steganography harnesses…

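A minimal sketch of the general word-shifting idea, using inter-word spacing to carry hidden bits. This is a generic Python illustration only, not the paper's method, which exploits the linguistic features of Arabic characters; the single-space/double-space encoding is an assumption for demonstration.

```python
import re

def embed(cover_text, bits):
    """Hide a bit string by varying inter-word spacing:
    a single space encodes '0', a double space encodes '1'."""
    words = cover_text.split()
    if len(bits) > len(words) - 1:
        raise ValueError("cover text has too few word gaps")
    pieces = [words[0]]
    for i, word in enumerate(words[1:]):
        gap = "  " if i < len(bits) and bits[i] == "1" else " "
        pieces.append(gap + word)
    return "".join(pieces)

def extract(stego_text, n_bits):
    """Read the hidden bits back from the word gaps."""
    gaps = re.findall(r" +", stego_text)
    return "".join("1" if len(g) > 1 else "0" for g in gaps[:n_bits])

secret = "1011"
stego = embed("هذا نص غلاف بسيط يستخدم لإخفاء البتات بين الكلمات", secret)
print(extract(stego, len(secret)) == secret)   # True
```
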
Publication Date: Sun Dec 31 2023
Journal Name: Iraqi Journal of Information and Communication Technology
EEG Signal Classification Based on Orthogonal Polynomials, Sparse Filter and SVM Classifier

This work implements an electroencephalogram (EEG) signal classifier. The method uses orthogonal polynomials (OP) to convert the EEG signal samples into moments, a sparse filter (SF) to reduce the number of moments and increase the classification accuracy, and a support vector machine (SVM) to classify the reduced moments into two classes. The proposed method's performance is tested and compared with two other methods on two datasets. Each dataset is divided into 80% for training and 20% for testing, with 5-fold cross-validation. The results show that the method exceeds the accuracy of the other methods, with best accuracies of 95.6% and 99.5% on the two datasets, respectively. Finally, from the results, it…

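A minimal sketch of the moment-then-classify pipeline, assuming NumPy and scikit-learn; Legendre polynomials stand in for the specific orthogonal basis, and SelectKBest replaces the sparse filter as a simple moment-reduction step, so this shows the structure of the pipeline rather than the paper's exact method.

```python
import numpy as np
from numpy.polynomial import legendre
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline

def op_moments(signal, order=16):
    """Project an EEG epoch onto Legendre polynomials on [-1, 1] and
    return the first `order` expansion coefficients as moments."""
    x = np.linspace(-1.0, 1.0, len(signal))
    return legendre.legfit(x, signal, deg=order - 1)

def build_classifier(epochs, labels):
    """epochs: (n_epochs, n_samples) array; labels: 0/1 per epoch."""
    X = np.array([op_moments(e) for e in epochs])
    clf = make_pipeline(SelectKBest(f_classif, k=8), SVC(kernel="rbf"))
    return clf.fit(X, labels)
```
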
Publication Date: Mon Dec 31 2018
Journal Name: Journal of Theoretical and Applied Information Technology
Fingerprints Identification and Verification Based on Local Density Distribution with Rotation Compensation

Fingerprints are the most widely used biometric feature for person identification and verification. The fingerprint is easy to interpret compared with other biometric types such as voice or face, and it can achieve a very high recognition rate. In this paper, a geometric rotation transform is applied to the fingerprint image to obtain a new level of features that represent the finger's characteristics and are used for personal identification; local features are used because of their ability to reflect the statistical behaviour of fingerprint variation across the image. The proposed fingerprint system contains three main stages: (i) preprocessing, (ii) feature extraction, and (iii) matching. The preprocessing…

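A minimal sketch of a block-wise local-density descriptor with rotation compensation, assuming NumPy and SciPy; the binarization threshold, the block grid, and the set of candidate rotation angles are illustrative assumptions rather than the paper's exact design.

```python
import numpy as np
from scipy.ndimage import rotate

def local_density(image, grid=8):
    """Ratio of ridge (dark) pixels in each cell of a grid x grid partition
    of a grayscale fingerprint image, flattened into a feature vector."""
    binary = image < image.mean()            # crude ridge/valley threshold
    rows = np.array_split(binary, grid, axis=0)
    return np.array([cell.mean()
                     for row in rows
                     for cell in np.array_split(row, grid, axis=1)])

def match_score(query, template, angles=range(-30, 31, 5)):
    """Rotation-compensated match: try several rotations of the query and
    keep the smallest Euclidean distance between density vectors."""
    ref = local_density(template)
    dists = [np.linalg.norm(local_density(rotate(query, a, reshape=False,
                                                 mode="nearest")) - ref)
             for a in angles]
    return min(dists)
```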