Image Compression based on Fixed Predictor Multiresolution Thresholding of Linear Polynomial Nearlossless Techniques

Image compression is a central concern in computer storage and transmission; it works by making efficient use of the redundancy embedded within an image itself, and it may additionally exploit the limitations of human vision or perception to remove imperceptible information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within an image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed. The first stage utilizes the lossy predictor model along with a multiresolution base and thresholding techniques; the second stage incorporates a near-lossless compression scheme on top of the first. The test results of both stages are promising, implicitly enhancing the performance of the traditional polynomial model in terms of compression ratio while preserving image quality.
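The two-part idea described above (a mathematical model plus a quantized residual) can be illustrated with a minimal sketch. The 1-D linear fit, block contents, and error bound `delta` below are illustrative assumptions, not the authors' exact predictor or multiresolution scheme.

```python
# Hedged sketch: linear polynomial modelling of a block, with near-lossless
# residual quantization. Reconstruction error is bounded by half the
# quantization step (2*delta + 1) / 2.

def compress_block(block, delta=2):
    """Fit a line a + b*i to a 1-D block (least squares), then quantize
    the residual with step 2*delta + 1."""
    n = len(block)
    xs = list(range(n))
    mean_x = sum(xs) / n
    mean_y = sum(block) / n
    var = sum((x - mean_x) ** 2 for x in xs)
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, block)) / var
    a = mean_y - b * mean_x
    step = 2 * delta + 1
    residuals = [round((y - (a + b * x)) / step) for x, y in zip(xs, block)]
    return a, b, residuals

def decompress_block(a, b, residuals, delta=2):
    step = 2 * delta + 1
    return [a + b * i + q * step for i, q in enumerate(residuals)]

block = [10, 12, 15, 14, 18, 21, 20, 25]
a, b, res = compress_block(block)
recon = decompress_block(a, b, res)
```

Small quantized residuals collapse to zero, which is where the compression gain comes from; the thresholding stage of the paper plays a similar role at multiple resolutions.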

Crossref
Publication Date
Fri Sep 01 2023
Journal Name
Al-Khwarizmi Engineering Journal
High Transaction Rates Performance Evaluation for Secure E-government Based on Private Blockchain Scheme

 

The implementation of technology in the provision of public services and communication to citizens, commonly referred to as e-government, has brought a multitude of benefits, including enhanced efficiency, accessibility, and transparency. Nevertheless, this approach also presents particular security concerns, such as cyber threats, data breaches, and access control. One technology that can aid in mitigating the effects of security vulnerabilities within e-government is permissioned blockchain. This work examines the performance of the Hyperledger Fabric private blockchain under high transaction loads by analyzing two scenarios that involve six organizations as case studies. Several parameters, such as transaction send ra

Scopus (1)
Crossref (1)
Scopus Crossref
Publication Date
Wed Jun 01 2022
Journal Name
Baghdad Science Journal
WOAIP: Wireless Optimization Algorithm for Indoor Placement Based on Binary Particle Swarm Optimization (BPSO)

Optimizing Access Point (AP) deployment plays a great role in wireless applications, due to the need to provide efficient communication at low deployment cost. Quality of Service (QoS) is a significant parameter and objective to be considered along with AP placement, as well as the overall deployment cost. This study proposes and investigates a multi-level optimization algorithm called Wireless Optimization Algorithm for Indoor Placement (WOAIP) based on Binary Particle Swarm Optimization (BPSO). WOAIP aims to obtain the optimum multi-floor AP placement with effective coverage, making it more capable of supporting QoS and cost-effectiveness. Five pairs (coverage, AP deployment) of weights, signal thresholds and received s

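The BPSO mechanism underlying WOAIP can be sketched in miniature. The candidate sites, user points, coverage radius, and cost weight below are hypothetical toy values, and the update rule is the standard sigmoid-based binary PSO variant, not the paper's exact multi-level WOAIP formulation.

```python
import math
import random

random.seed(1)

# Toy BPSO sketch (assumed parameters): each bit says whether a candidate
# AP site is used; fitness rewards covered user points and penalizes the
# number of deployed APs.

SITES = [(1, 1), (4, 1), (1, 4), (4, 4)]          # candidate AP positions
USERS = [(0, 0), (2, 1), (4, 2), (3, 4), (1, 3)]  # points needing coverage
RADIUS = 2.0
AP_COST = 0.5

def fitness(bits):
    covered = sum(
        any(b and math.dist(u, s) <= RADIUS for b, s in zip(bits, SITES))
        for u in USERS)
    return covered - AP_COST * sum(bits)

def bpso(n_particles=10, iters=50, w=0.7, c1=1.5, c2=1.5):
    dim = len(SITES)
    xs = [[random.randint(0, 1) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    gbest = max(pbest, key=fitness)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * random.random() * (pbest[i][d] - xs[i][d])
                            + c2 * random.random() * (gbest[d] - xs[i][d]))
                # Binary PSO: sigmoid of velocity gives the probability
                # that the bit is set to 1.
                xs[i][d] = 1 if random.random() < 1 / (1 + math.exp(-vs[i][d])) else 0
            if fitness(xs[i]) > fitness(pbest[i]):
                pbest[i] = xs[i][:]
        gbest = max(pbest + [gbest], key=fitness)[:]
    return gbest

best = bpso()
```

On this toy instance the swarm converges to a small subset of sites that covers every user point, which mirrors the coverage-versus-cost trade-off WOAIP optimizes per floor.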
Scopus (7)
Crossref (5)
Scopus Clarivate Crossref
Publication Date
Wed Mar 16 2022
Journal Name
International Journal of Recent Contributions from Engineering, Science & IT
Smart Learning based on Moodle E-learning Platform and Digital Skills for University Students

Publication Date
Fri Nov 04 2022
Journal Name
Journal of Optics
Coreless optical fiber for hemoglobin (HB) sensing with bilayer based on surface plasmon resonance

In this work, an optical fiber biomedical sensor for detecting the ratio of hemoglobin in the blood is presented. A surface plasmon resonance (SPR)-based coreless optical fiber was developed and implemented using single- and multi-mode optical fibers. The sensor is also utilized to evaluate refractive indices and concentrations of hemoglobin in blood samples, with a 40 nm bilayer thickness (20 nm Au and 20 nm Ag) to increase the sensitivity. It is found in practice that as the refractive index being sensed increases, the resonant wavelength increases, owing to the decrease in energy.

Scopus (6)
Crossref (5)
Scopus Clarivate Crossref
Publication Date
Fri Jan 01 2021
Journal Name
IEEE Access
Microwave Nondestructive Testing for Defect Detection in Composites Based on K-Means Clustering Algorithm

Scopus (64)
Crossref (63)
Scopus Clarivate Crossref
Publication Date
Sun Feb 25 2024
Journal Name
Baghdad Science Journal
Exploring Important Factors in Predicting Heart Disease Based on Ensemble- Extra Feature Selection Approach

Heart disease is a significant and impactful health condition that ranks as the leading cause of death in many countries. In order to aid physicians in diagnosing cardiovascular diseases, clinical datasets are available for reference. However, with the rise of big data and medical datasets, it has become increasingly challenging for medical practitioners to accurately predict heart disease due to the abundance of unrelated and redundant features that increase computational complexity and hinder accuracy. As such, this study aims to identify the most discriminative features within high-dimensional datasets while minimizing complexity and improving accuracy through an Extra Tree-based feature selection technique. The study assesses the efficac

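The Extra-Tree-based feature ranking idea can be sketched as follows. scikit-learn and a synthetic dataset are assumed here; the paper works with clinical heart-disease data and its own evaluation protocol.

```python
# Hedged sketch: rank features by Extra-Trees impurity importance and keep
# only the above-average ones, discarding unrelated/redundant features.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

rng = np.random.default_rng(0)
noise = rng.normal(size=(300, 8))         # 8 irrelevant features
signal = rng.normal(size=(300, 2))        # 2 informative features
X = np.hstack([noise, signal])            # 10 features total
y = (signal.sum(axis=1) > 0).astype(int)  # label depends only on the last 2

model = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(X, y)
imp = model.feature_importances_
kept = np.flatnonzero(imp > imp.mean())   # keep above-average features
```

On this synthetic data the two informative columns (indices 8 and 9) dominate the importance scores, so the selection step prunes the noise features before any downstream classifier is trained.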
Scopus (6)
Crossref (4)
Scopus Crossref
Publication Date
Thu Jun 01 2023
Journal Name
IAES International Journal of Artificial Intelligence (IJ-AI)
Innovations in t-way test creation based on a hybrid hill climbing-greedy algorithm

In combinatorial testing development, the fabrication of covering arrays is the key challenge, owing to the multiple aspects that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining the greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests. Metaheuristic-based methods are used to deal with tuples that may be left after redundancy removal by greedy strategies; the result is then assured to be near-optimal by the metaheuristic algorithm. As a result, the use of both greedy and HC algorithms in a single test generation system is a good candidate if constructed correctly. T

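The greedy-plus-hill-climbing combination can be sketched for the simplest case, pairwise (t=2) testing. The parameter model and the per-parameter climbing step below are illustrative assumptions, not the paper's exact hybrid algorithm.

```python
import itertools
import random

random.seed(0)

# Hedged sketch: greedy pairwise test generation where each new test is
# hill-climbed, one parameter at a time, toward the value that covers the
# most still-uncovered value pairs.

PARAMS = [2, 2, 3]  # number of values per parameter (hypothetical system)

def uncovered_pairs(tests):
    """All (param_i, value_i, param_j, value_j) pairs not yet covered."""
    need = set()
    values = [(i, v) for i, n in enumerate(PARAMS) for v in range(n)]
    for (i, vi), (j, vj) in itertools.combinations(values, 2):
        if i != j:
            need.add((i, vi, j, vj))
    for t in tests:
        for i, j in itertools.combinations(range(len(PARAMS)), 2):
            need.discard((i, t[i], j, t[j]))
    return need

def generate():
    tests = []
    while True:
        need = uncovered_pairs(tests)
        if not need:
            return tests
        # Start from a random test, then hill-climb each parameter.
        cand = [random.randrange(n) for n in PARAMS]
        for i, n in enumerate(PARAMS):
            def gain(v):
                trial = cand[:i] + [v] + cand[i + 1:]
                return sum(1 for (a, va, b, vb) in need
                           if trial[a] == va and trial[b] == vb)
            cand[i] = max(range(n), key=gain)
        tests.append(cand)

suite = generate()
```

The resulting suite covers every pair of parameter values with far fewer tests than the exhaustive cross product, which is the point of covering arrays.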
Scopus (5)
Crossref (5)
Scopus Crossref
Publication Date
Sun Dec 03 2017
Journal Name
Baghdad Science Journal
Network Self-Fault Management Based on Multi-Intelligent Agents and Windows Management Instrumentation (WMI)

This paper proposed a new method for network self-fault management (NSFM) based on two technologies: an intelligent agent to automate fault management tasks, and Windows Management Instrumentation (WMI) to identify the fault faster when resources are independent (different types of devices). The proposed network self-fault management reduced the network traffic load by reducing the requests and responses between the server and client, which achieves less downtime for each node in the event of a fault occurring in the client. The performance of the proposed system is measured by three measures: efficiency, availability, and reliability. A high average efficiency is obtained, depending on the faults that occurred in the system, which reaches

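Of the three measures mentioned above, availability is the most mechanical to compute. The per-node uptime/downtime figures below are hypothetical, not the paper's measurements.

```python
# Hedged sketch: availability as the fraction of time a node is up,
# computed from (uptime, downtime) hours per node.

def availability(uptime_hours, downtime_hours):
    return uptime_hours / (uptime_hours + downtime_hours)

# Hypothetical monthly logs for two clients (720 hours total each).
nodes = {"client-1": (718.5, 1.5), "client-2": (700.0, 20.0)}
report = {name: round(availability(up, down), 4)
          for name, (up, down) in nodes.items()}
```

Reducing downtime per fault, as the proposed NSFM does, raises this ratio directly.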
Scopus Crossref
Publication Date
Sun Jun 12 2011
Journal Name
Baghdad Science Journal
An algorithm for binary codebook design based on the average bitmap replacement error (ABPRE)

In this paper, an algorithm for binary codebook design is used in the vector quantization technique, which is employed to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. The vector quantization (VQ) method is used to compress the bitmap (the output of the first method, AMBTC). The binary codebook can be generated for many images by randomly choosing code vectors from a set of binary image vectors, and this codebook is then used to compress all bitmaps of these images. The bitmap of an image is chosen for compression with this codebook based on the criterion of the average bitmap replacement error (ABPRE). This approach is suitable for reducing bit rates

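The codebook construction and the ABPRE criterion can be sketched as follows. The bitmap size, codebook size, and random training data below are toy assumptions, not the paper's configuration.

```python
import random

random.seed(3)

# Hedged sketch: a binary codebook is drawn at random from training
# bitmaps; each 4x4 AMBTC bitmap (16 bits) is replaced by its nearest
# codeword in Hamming distance; ABPRE = mean number of replaced bits.

DIM = 16            # bits per 4x4 AMBTC bitmap
CODEBOOK_SIZE = 8   # toy codebook size

def rand_bitmap():
    return tuple(random.randint(0, 1) for _ in range(DIM))

bitmaps = [rand_bitmap() for _ in range(200)]      # training bitmaps
codebook = random.sample(bitmaps, CODEBOOK_SIZE)   # random code vectors

def nearest(bm):
    """Codeword with the smallest Hamming distance to the bitmap."""
    return min(codebook, key=lambda c: sum(a != b for a, b in zip(bm, c)))

def abpre(bms):
    errs = [sum(a != b for a, b in zip(bm, nearest(bm))) for bm in bms]
    return sum(errs) / len(errs)

score = abpre(bitmaps)
```

A lower ABPRE means the codebook reproduces the bitmaps more faithfully, so it serves as the selection criterion when deciding which bitmaps to compress with a given codebook.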
Crossref
Publication Date
Tue Aug 01 2023
Journal Name
Baghdad Science Journal
Digital Data Encryption Using a Proposed W-Method Based on AES and DES Algorithms

This paper proposes a new encryption method. It combines two cipher algorithms, i.e., DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: our approach merges 64 bits of DES with 64 bits of AES to produce 128 bits as a root key for the remaining 15 keys. This complexity increases the level of the ciphering process; moreover, each key is derived by shifting the operation only one bit to the right. The second is the nature of the encryption process: it includes two keys and mixes one round of DES with one round of AES to reduce the processing time. The W-method deals with

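The key-schedule idea described above can be sketched directly: two 64-bit halves are merged into a 128-bit root key, and 15 further keys are derived by a one-bit right rotation each. This is one reading of the abstract, not the authors' exact W-method, and the halves here are random stand-ins rather than real DES/AES key material.

```python
import secrets

# Hedged sketch: 128-bit root key from two 64-bit halves, plus 15 derived
# keys, each a 1-bit right rotation of the previous one.

MASK128 = (1 << 128) - 1

def rotr128(k, n=1):
    """Rotate a 128-bit integer right by n bits."""
    return ((k >> n) | (k << (128 - n))) & MASK128

def derive_keys(des_half, aes_half):
    root = (des_half << 64) | aes_half  # merge the two 64-bit halves
    keys = [root]
    for _ in range(15):
        keys.append(rotr128(keys[-1]))
    return keys

keys = derive_keys(secrets.randbits(64), secrets.randbits(64))
```

The rotation keeps derivation cheap (one shift per round key) while ensuring all 16 keys differ, matching the abstract's emphasis on low-cost key expansion.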
Scopus (8)
Crossref (4)
Scopus Crossref