Images Compression using Combined Scheme of Transform Coding

Several problems must be solved in image compression to make the process practical and more efficient. Much work has been done on lossy image compression based on the wavelet transform and the Discrete Cosine Transform (DCT). In this paper, an efficient image compression scheme is proposed, based on a combined transform-coding scheme consisting of the following steps: 1) a biorthogonal (9/7-tap) wavelet transform to split the image data into sub-bands, 2) a DCT to de-correlate the data, 3) scalar quantization of the combined transform stage's output, followed by a mapping to positive values, and 4) LZW encoding to produce the compressed data. The peak signal-to-noise ratio (PSNR), compression ratio (CR), and compression gain (CG) measures were used for a comparative analysis of the performance of the whole system, and several test images were used to evaluate its behavior. The simulation results show the efficiency of these combined transforms when LZW is used for data compression. The compression outcomes are encouraging and show a significant reduction in image file size at good resolution.
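
A rough, hedged sketch of the pipeline described above: a biorthogonal wavelet decomposition, a 2-D DCT, scalar quantization with a shift to non-negative symbols, and a classic LZW coder. The library choices (PyWavelets' "bior4.4" standing in for the 9/7-tap biorthogonal filter, SciPy's DCT), the quantization step, and the byte-level clipping are illustrative assumptions, not the authors' exact settings.

```python
# Illustrative sketch only: library choices and parameters are assumptions.
import numpy as np
import pywt
from scipy.fft import dctn

def lzw_encode(data: bytes) -> list[int]:
    """Classic LZW: grow a dictionary of byte strings, emit integer codes."""
    table = {bytes([i]): i for i in range(256)}
    w, codes = b"", []
    for byte in data:
        wb = w + bytes([byte])
        if wb in table:
            w = wb
        else:
            codes.append(table[w])
            table[wb] = len(table)
            w = bytes([byte])
    if w:
        codes.append(table[w])
    return codes

def compress(image: np.ndarray, q_step: float = 16.0) -> list[int]:
    # 1) Biorthogonal 9/7-tap wavelet splits the image into sub-bands.
    ll, (lh, hl, hh) = pywt.dwt2(image.astype(float), "bior4.4")
    # 2) A 2-D DCT further de-correlates each sub-band.
    bands = [dctn(b, norm="ortho") for b in (ll, lh, hl, hh)]
    # 3) Scalar quantization, then a shift so every symbol is non-negative.
    quantized = [np.round(b / q_step).astype(np.int32) for b in bands]
    symbols = np.concatenate([b.ravel() for b in quantized])
    symbols -= symbols.min()
    # 4) LZW coding of the symbol stream (clipped to one byte per symbol here).
    return lzw_encode(np.clip(symbols, 0, 255).astype(np.uint8).tobytes())
```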

Publication Date: Mon Jul 01 2019
Journal Name: International Journal Of Computer Science And Mobile Computing
Color Image Compression of Inter-Prediction Base

Publication Date: Wed Jan 01 2020
Journal Name: International Journal Of Software & Hardware Research In Engineering
Frontal Facial Image Compression of Hybrid Base

Publication Date: Mon Feb 04 2019
Journal Name: Journal Of The College Of Education For Women
Image Watermarking based on Huffman Coding and Laplace Sharpening

In this paper, an algorithm is introduced through which we can embed more data than regular spatial-domain methods allow. The secret data are first compressed using Huffman coding, and the compressed data are then embedded using a Laplacian sharpening method. Laplace filters are used to determine effective hiding places; based on a threshold value, the positions with the highest filter responses are selected for embedding the watermark. Our aim in this work is to increase the capacity of the embedded information by using Huffman coding and, at the same time, to increase the security of the algorithm by hiding the data at the positions with the highest edge values, where changes are less noticeable. The perform…
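
A minimal sketch of the embedding step described above, assuming an 8-bit grayscale cover image: a Laplacian filter ranks pixels by edge strength, and the payload bits (assumed to be already Huffman-compressed) are written into the least significant bits of the strongest-edge positions. The kernel, the selection rule, and the LSB embedding are illustrative assumptions rather than the paper's exact procedure.

```python
# Illustrative sketch only: the payload is assumed to be Huffman-compressed bits.
import numpy as np
from scipy.ndimage import convolve

LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def embed(cover: np.ndarray, bits: list[int]) -> np.ndarray:
    """Hide `bits` in the LSBs of the pixels with the strongest Laplacian response."""
    response = np.abs(convolve(cover.astype(float), LAPLACIAN))
    # Keep the len(bits) positions with the highest edge values.
    order = np.argsort(response, axis=None)[::-1][: len(bits)]
    rows, cols = np.unravel_index(order, cover.shape)
    stego = cover.copy()
    for r, c, b in zip(rows, cols, bits):
        stego[r, c] = (stego[r, c] & 0xFE) | b  # overwrite the least significant bit
    return stego
```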
Publication Date: Sat Feb 09 2019
Journal Name: Journal Of The College Of Education For Women
Hybrid Transform Based Denoising with Block Thresholding

A frequently used approach to denoising is shrinkage of the coefficients of the noisy signal's representation in a transform domain. This paper proposes an algorithm based on a hybrid transform (a stationary wavelet transform followed by a slantlet transform): the slantlet transform is applied to the approximation subband of the stationary wavelet transform, and the BlockShrink thresholding technique is then applied to the hybrid transform coefficients. This technique selects the optimal block size and threshold for every wavelet subband by minimizing Stein's unbiased risk estimate (SURE). The proposed algorithm was implemented in MATLAB R2010a and tested on natural images contaminated by white Gaussian noise. Numerical results show that our algorithm co…
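
The sketch below is a simplified stand-in for the hybrid denoiser, assuming an even-sized grayscale image and a known noise level: a single-level stationary wavelet decomposition (db4 here, since no slantlet transform is available in common Python libraries) followed by block-wise shrinkage of the detail sub-bands. The SURE-driven choice of block size and threshold is not reproduced; the block size and the shrinkage constant are assumptions.

```python
# Simplified stand-in: SWT + block shrinkage; the slantlet stage and the
# SURE-based block-size selection of BlockShrink are omitted.
import numpy as np
import pywt

def block_shrink(band: np.ndarray, block: int = 4, lam: float = 1.0) -> np.ndarray:
    """Scale each block of coefficients by (1 - lam * L / energy)_+ (James-Stein style)."""
    out = band.copy()
    for i in range(0, band.shape[0], block):
        for j in range(0, band.shape[1], block):
            blk = band[i:i + block, j:j + block]
            energy = float(np.sum(blk ** 2))
            scale = max(0.0, 1.0 - lam * blk.size / max(energy, 1e-12))
            out[i:i + block, j:j + block] = scale * blk
    return out

def denoise(noisy: np.ndarray, sigma: float) -> np.ndarray:
    # Single-level stationary wavelet transform (image sides must be even).
    ca, (ch, cv, cd) = pywt.swt2(noisy.astype(float), "db4", level=1)[0]
    lam = 4.505 * sigma ** 2          # BlockShrink-style constant times sigma^2
    details = tuple(block_shrink(b, block=4, lam=lam) for b in (ch, cv, cd))
    return pywt.iswt2([(ca, details)], "db4")
```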
Publication Date: Tue Sep 11 2018
Journal Name: Iraqi Journal Of Physics
Contrast enhancement of infrared images using Adaptive Histogram Equalization (AHE) with Contrast Limited Adaptive Histogram Equalization (CLAHE)

The objective of this paper is to improve the overall quality of infrared (IR) images by proposing an enhancement algorithm based on two methods: Adaptive Histogram Equalization (AHE) and Contrast Limited Adaptive Histogram Equalization (CLAHE). The contribution of this paper is an assessment of how well these contrast-enhancement procedures perform on infrared images, and a recommendation of the strategy that may be most appropriate for incorporation into commercial infrared imaging applications.
The database for this paper consists of night-vision infrared images taken with a Zenmuse camera (FLIR Systems, Inc.) mounted on a Matrice 100 drone in Karbala city. The experimental tests showed sign…
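
A minimal sketch of the two enhancement methods compared above, using widely available implementations: scikit-image's adaptive equalization with the clip limit disabled as a stand-in for plain AHE, and OpenCV's CLAHE. The clip limit and tile size below are illustrative values, not the paper's settings.

```python
# Illustrative sketch only: parameters are assumptions, not the paper's settings.
import cv2
import numpy as np
from skimage import exposure

def enhance_ir(gray: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Return (AHE-like, CLAHE) enhancements of an 8-bit grayscale IR frame."""
    # AHE stand-in: adaptive equalization with no contrast clipping (clip_limit=1.0).
    ahe = exposure.equalize_adapthist(gray, clip_limit=1.0)      # floats in [0, 1]
    ahe = (ahe * 255).astype(np.uint8)
    # CLAHE: same idea, but histogram counts are clipped to limit noise amplification.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8)).apply(gray)
    return ahe, clahe
```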
Publication Date: Wed Jul 01 2015
Journal Name: Magnetic Resonance Imaging
Alpha shape theory for 3D visualization and volumetric measurement of brain tumor progression using magnetic resonance images
Publication Date: Sun Apr 30 2023
Journal Name: Iraqi Journal Of Science
Numerical and Analytical Solutions of Space-Time Fractional Partial Differential Equations by Using a New Double Integral Transform Method

This work reviews the origins of fractional calculus and how the Sumudu and Elzaki transforms are applied to fractional derivatives. The approach combines the two into a double Sumudu-Elzaki transform strategy for finding analytic solutions, expressed in Mittag-Leffler functions, to space-time fractional partial differential equations subject to initial and boundary conditions. The method converges toward the exact solution, and its efficacy is demonstrated with numerical examples computed in MATLAB R2015a.
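
For reference, the single-variable transforms from which the double transform is built, together with the Mittag-Leffler function in which the solutions are expressed, in their standard forms from the literature (the paper's exact conventions may differ):

```latex
% Sumudu transform (in t) and Elzaki transform (in x), standard definitions:
S[f(t)](u) = \int_{0}^{\infty} f(ut)\, e^{-t}\, dt
           = \frac{1}{u}\int_{0}^{\infty} f(t)\, e^{-t/u}\, dt, \qquad
E[g(x)](v) = v \int_{0}^{\infty} g(x)\, e^{-x/v}\, dx.

% One-parameter Mittag-Leffler function used to express the solutions:
E_{\alpha}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + 1)}, \qquad \alpha > 0.
```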

Publication Date: Thu Dec 31 2020
Journal Name: Journal Of New Theory
Brief review of soft sets and its application in coding theory

In this paper, we focus on one of the recent applications of PU-algebras in coding theory, namely the construction of codes by soft-set PU-valued functions. First, we introduce the notion of a soft-set PU-valued function on a PU-algebra and investigate some of its related properties. Moreover, the codes generated by a soft-set PU-valued function are constructed and several examples are given. Furthermore, an example, with graphs, of a binary block code constructed from a soft-set PU-valued function is presented.
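
As a purely illustrative example of how a soft set can generate a binary block code (an assumption for illustration, not necessarily the construction used in the paper): each parameter of a soft set (F, A) over a finite universe contributes the indicator vector of its approximation set F(a) as a codeword.

```python
# Illustrative assumption: codewords are indicator vectors of the soft set's
# approximation sets over a fixed ordering of the universe.
from typing import Dict, Hashable, Iterable, List

def soft_set_code(universe: List[Hashable],
                  soft_set: Dict[Hashable, Iterable[Hashable]]) -> List[str]:
    """Return one binary codeword (bit string) per parameter of the soft set."""
    codewords = []
    for param, subset in soft_set.items():
        members = set(subset)
        codewords.append("".join("1" if u in members else "0" for u in universe))
    return codewords

# Toy soft set over a four-element universe.
U = ["x1", "x2", "x3", "x4"]
F = {"a1": {"x1", "x3"}, "a2": {"x2", "x3", "x4"}, "a3": {"x1"}}
print(soft_set_code(U, F))  # ['1010', '0111', '1000']
```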

Publication Date: Wed Jan 01 2025
Journal Name: Current Neuropharmacology
Ischemic Stroke and Autophagy: The Roles of Long Non-Coding RNAs

Ischemic stroke is a significant cause of morbidity and mortality worldwide. Autophagy, a process of intracellular degradation, has been shown to play a crucial role in the pathogenesis of ischemic stroke. Long non-coding RNAs (lncRNAs) have emerged as essential regulators of autophagy in various diseases, including ischemic stroke. Recent studies have identified several lncRNAs that modulate autophagy in ischemic stroke, including MALAT1, MIAT, SNHG12, H19, AC136007.2, C2dat2, MEG3, KCNQ1OT1, SNHG3, and RMRP. These lncRNAs regulate autophagy by interacting with key proteins involved in the autophagic process, such as Beclin-1, ATG7, and LC3. Understanding the role of lncRNAs in regulating auto…

Publication Date: Sun Mar 20 2016
Journal Name: Al-academy
Indicative coding of the actor’s performance in the Iraqi theater show
