Audio Compression Using Transform Coding with LZW and Double Shift Coding

Zainab J. Ahmed & Loay E. George. Conference paper, first online 11 January 2022. Part of the Communications in Computer and Information Science book series (CCIS, volume 1511).

Abstract: The need for audio compression remains a vital issue because of its significance in reducing the data size of one of the most common digital media exchanged between distant parties. In this paper, the efficiencies of two audio compression modules were investigated: the first module is based on the discrete cosine transform and the second on the discrete wavelet transform. The proposed audio compression system consists of the following steps: (1) load the digital audio data; (2) apply a transform (bi-orthogonal wavelet or discrete cosine transform) to decompose the audio signal; (3) quantize the coefficients (the scheme depends on the transform used); (4) apply run-length encoding to the quantized data, which is separated into two sequence vectors (runs and non-zero values) to reduce the long zero runs. Each resulting vector is passed to an entropy encoder to complete the compression process. Two entropy encoders are used: the first is the lossless compression method LZW, and the second is an advanced version of the traditional shift coding method called double shift coding (DSC). The proposed system's performance is analyzed using distinct audio samples of different sizes and characteristics with various audio signal parameters, and is evaluated using Peak Signal to Noise Ratio and Compression Ratio. The outcomes for the audio samples show that the system is simple and fast and achieves good compression gain, and the results show that the DSC encoding time is less than the LZW encoding time.
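As a rough sketch of the pipeline the abstract outlines for the DCT module, the code below applies a block DCT, uniform quantization, and a run/value split via run-length encoding to a toy audio signal. The block size, quantization step, and test signal are assumptions for illustration only; the two resulting vectors would then be passed to an entropy coder such as LZW or double shift coding, neither of which is implemented here.

```python
import numpy as np
from scipy.fft import dct  # DCT used as a stand-in for the paper's first module

def compress_block_dct(signal, block_size=1024, q_step=0.02):
    """Toy sketch: block DCT -> uniform quantization -> run/value split.

    block_size and q_step are illustrative choices, not the paper's settings.
    Returns the two vectors (zero-run lengths, non-zero values) that an entropy
    coder such as LZW or double shift coding would then compress.
    """
    # Pad the signal so it divides evenly into blocks.
    pad = (-len(signal)) % block_size
    padded = np.pad(signal.astype(float), (0, pad))
    blocks = padded.reshape(-1, block_size)

    # Transform and quantize each block; many coefficients become zero.
    coeffs = dct(blocks, norm='ortho', axis=1)
    quantized = np.round(coeffs / q_step).astype(np.int64).ravel()

    # Run-length split: count zero runs, keep the non-zero values separately.
    runs, values = [], []
    zero_run = 0
    for c in quantized:
        if c == 0:
            zero_run += 1
        else:
            runs.append(zero_run)
            values.append(int(c))
            zero_run = 0
    runs.append(zero_run)  # trailing zeros (0 if the data ends on a value)
    return runs, values

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    tone = np.sin(2 * np.pi * 440 * np.arange(8000) / 8000) + 0.01 * rng.standard_normal(8000)
    runs, values = compress_block_dct(tone)
    print(len(runs), len(values))  # the vectors that would feed LZW / DSC
```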

Publication Date
Thu Jan 04 2018
Journal Name
Journal Of Electrical Engineering And Technology
An efficient selective method for audio watermarking against de-synchronization attacks

Publication Date
Sun Jan 24 2016
Journal Name
Al-academy
Shift formal regulations on Mesopotamian pottery (Pottery Samarra model): محمد جاسم محمد العبيدي

The theory of deviation ("shift theory") operated as a modern discipline in most critical and technical studies of the early twentieth century, making the sciences, arts, and culture fields of experience for it. In exploring its foundations, the theory established a number of laws governing the internal structures of those works, so that these foundations could actively contribute their ideas to guiding progress along the right track. The foundations of the theory were thus laid; it was a major concern in the early twentieth century and continues at a vigorous pace to this day, particularly in its applications across the various fields of the arts. Although each type of art, whether in composition, the theater, or the means of communication, took…

Publication Date
Thu Feb 07 2019
Journal Name
Journal Of The College Of Education For Women
EFFICIENCY SPIHT IN COMPRESSION AND QUALITY OF IMAGE

Image compression is an important tool to reduce the bandwidth and storage requirements of practical image systems; compression techniques are needed to cope with the increasing demand for storage space and transmission time. In this paper, a discrete wavelet transform based image codec using Set Partitioning In Hierarchical Trees (SPIHT) is implemented. Mean Square Error (MSE), Peak Signal to Noise Ratio (PSNR), and Maximum Difference (MD) are used to measure the quality of the reconstructed image; MSE and PSNR are the most common picture quality measures. Different kinds of test images are assessed in this work at different compression ratios. The results show the high efficiency of the SPIHT algorithm…
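For reference, the sketch below computes the three quality measures named in the abstract (MSE, PSNR, and Maximum Difference) between an original and a reconstructed image. The 8-bit peak value and the toy images are assumptions, and no SPIHT coding is performed here.

```python
import numpy as np

def quality_metrics(original, reconstructed, peak=255.0):
    """MSE, PSNR (dB), and Maximum Difference between two images (8-bit peak assumed)."""
    original = original.astype(float)
    reconstructed = reconstructed.astype(float)
    diff = original - reconstructed
    mse = np.mean(diff ** 2)
    psnr = float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
    md = np.max(np.abs(diff))
    return mse, psnr, md

# Toy usage: compare a ramp image with a coarsely quantized copy of itself.
img = np.tile(np.arange(256, dtype=np.uint8), (256, 1))
approx = (img // 16) * 16  # crude stand-in for a lossy reconstruction
print(quality_metrics(img, approx))
```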

Publication Date
Thu Jun 01 2023
Journal Name
Journal Of Engineering
Fault Location of Doukan-Erbil 132 kV Double Transmission Lines Using Artificial Neural Network (ANN)

Transmission lines are generally subject to faults, so it is advantageous to locate these faults as quickly as possible. This study uses an Artificial Neural Network technique to locate a fault as soon as it occurs on the Doukan-Erbil 132 kV double transmission line network. The suggested network was modeled in simulation using CYME 7.1 programming/Simulink. A multilayer perceptron feed-forward artificial neural network with a back-propagation learning algorithm is used for the intelligent locator's training, testing, assessment, and validation. Voltages and currents were applied as inputs during the neural network's training, and the pre-fault and post-fault values determined the scaled values. The neural network's p…
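For orientation only, the sketch below trains a small multilayer perceptron regressor on synthetic voltage/current ratios to predict fault distance. The 100 km line length, the toy voltage/current relations, and scikit-learn's MLPRegressor are assumptions standing in for the CYME/Simulink data and the network described in the paper.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in data: six inputs (three-phase voltages and currents at one line
# end, scaled by pre-fault values) and one target (fault distance along a 100 km line).
rng = np.random.default_rng(42)
n = 3000
distance = rng.uniform(0.0, 100.0, n)               # assumed fault distance in km
v_ratio = 0.2 + 0.8 * distance / 100.0              # crude: voltage sags more for close-in faults
i_ratio = 1.0 / (0.1 + distance / 100.0)            # crude: fault current falls with distance
X = np.column_stack([v_ratio, v_ratio, v_ratio, i_ratio, i_ratio, i_ratio])
X += 0.02 * rng.standard_normal(X.shape)            # measurement noise
y = distance

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Feed-forward MLP trained with a back-propagation-style gradient method.
locator = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=3000, random_state=0),
)
locator.fit(X_train, y_train)
errors = np.abs(locator.predict(X_test) - y_test)
print(f"mean absolute location error: {errors.mean():.2f} km")
```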

Publication Date
Tue Dec 26 2017
Journal Name
Al-khwarizmi Engineering Journal
FPGA Realization of Two-Dimensional Wavelet and Wavelet Packet Transform


The Field Programmable Gate Array (FPGA) approach is the most recent category used in the implementation of most Digital Signal Processing (DSP) applications. It has proved capable of handling such problems and supports the necessary requirements of scalability, speed, size, cost, and efficiency.

In this paper, a new circuit design is proposed and implemented on an FPGA for evaluating the coefficients of the two-dimensional Wavelet Transform (WT) and Wavelet Packet Transform (WPT).

In this implementation, the evaluation of the WT and WPT coefficients depends on filter-tree decomposition using the 2-D discrete convolution algorithm. This implementation w…
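As a software illustration of the filter-tree decomposition the abstract describes, the sketch below computes one level of a separable 2-D wavelet transform by row/column convolution and downsampling. The Haar filters and the NumPy/SciPy implementation are assumptions standing in for the hardware circuit; a wavelet packet transform would recurse into every sub-band rather than only the low-pass one.

```python
import numpy as np
from scipy.signal import convolve2d

# Haar analysis filters (an illustrative choice; the paper's filter bank may differ).
low = np.array([1.0, 1.0]) / np.sqrt(2.0)
high = np.array([1.0, -1.0]) / np.sqrt(2.0)

def dwt2_level(image):
    """One level of a separable 2-D DWT via row/column convolution and downsampling.

    Returns four sub-bands (approximation plus three detail bands); a wavelet packet
    transform would apply the same step recursively to every sub-band.
    """
    def analyze(img, row_f, col_f):
        tmp = convolve2d(img, row_f.reshape(1, -1), mode='same')[:, ::2]   # filter along rows, downsample
        return convolve2d(tmp, col_f.reshape(-1, 1), mode='same')[::2, :]  # filter along columns, downsample
    return (analyze(image, low, low), analyze(image, low, high),
            analyze(image, high, low), analyze(image, high, high))

# Toy usage on a small ramp image.
img = np.add.outer(np.arange(8.0), np.arange(8.0))
ll, lh, hl, hh = dwt2_level(img)
print(ll.shape, lh.shape, hl.shape, hh.shape)  # each sub-band is 4x4
```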

Publication Date
Mon May 11 2020
Journal Name
Baghdad Science Journal
Towards Accurate Pupil Detection Based on Morphology and Hough Transform

Automatic recognition of individuals is very important in the modern era, and biometric techniques have emerged as an answer to the problem of automatic individual recognition. This paper presents a pupil detection technique that combines simple morphological operations with the Hough Transform (HT). The circular region of the eye and pupil is segmented by the morphological filter together with the Hough Transform, and the local iris area is converted into a rectangular block for the purpose of calculating inconsistencies in the image. The method is implemented and tested on the Chinese Academy of Sciences (CASIA V4) iris image database (249 persons) and the IIT Delhi (IITD) iris…
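For illustration, a minimal OpenCV sketch of the kind of pipeline the abstract describes: thresholding plus morphological opening/closing to isolate the dark pupil region, followed by a circular Hough transform. The threshold value, kernel size, and Hough parameters are guesses rather than the paper's settings.

```python
import cv2
import numpy as np

def detect_pupil(gray_eye):
    """Sketch of pupil detection: threshold + morphology to isolate the dark pupil,
    then a circular Hough transform to recover its centre and radius.

    Returns (x, y, r) or None. All parameter values are illustrative guesses.
    """
    # The pupil is usually the darkest region; a low fixed threshold keeps it.
    _, mask = cv2.threshold(gray_eye, 60, 255, cv2.THRESH_BINARY_INV)

    # Opening/closing removes eyelash noise and fills specular-highlight holes.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    # Circular Hough transform on the cleaned binary mask.
    circles = cv2.HoughCircles(mask, cv2.HOUGH_GRADIENT, dp=2, minDist=gray_eye.shape[0],
                               param1=100, param2=20, minRadius=15, maxRadius=80)
    if circles is None:
        return None
    x, y, r = circles[0][0]
    return int(x), int(y), int(r)

# Usage (the file path is hypothetical): load a grayscale eye image, locate the pupil.
# eye = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)
# print(detect_pupil(eye))
```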

Publication Date
Fri Nov 01 2013
Journal Name
Journal Of Cosmetics, Dermatological Sciences And Applications
Treatment of chronic paronychia: A double blind comparative clinical trial using singly vaseline, nystatin and fucidic acid ointment

K. E. Sharquie, A. A. Noaimi, S. A. Galib. Journal of Cosmetics, Dermatological Sciences and Applications, 2013.

Publication Date
Sun Mar 01 2020
Journal Name
Baghdad Science Journal
A Comparative Study on the Double Prior for Reliability Kumaraswamy Distribution with Numerical Solution

This work deals with the Kumaraswamy distribution. Kumaraswamy (1976, 1978) worked with well-known probability distribution functions such as the normal, beta, and log-normal, but in 1980 he developed a more general probability density function for doubly bounded random processes, which is known as Kumaraswamy's distribution. Classical maximum likelihood and Bayes estimators are used to estimate the unknown shape parameter (b). The reliability function is obtained under symmetric loss functions using three types of informative priors: two single priors and one double prior. In addition, a comparison is made of the performance of these estimators with respect to the numerical solution, which is found using the expansion method. The…
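As a small worked illustration of the distribution the abstract refers to, the sketch below draws Kumaraswamy samples by inverse-CDF sampling, computes the maximum likelihood estimate of the shape parameter b when the other shape parameter a is treated as known, and evaluates the reliability function R(t) = (1 - t^a)^b. The parameter values are illustrative, and the Bayes and double-prior estimators studied in the paper are not reproduced here.

```python
import numpy as np

def kumaraswamy_sample(a, b, size, rng):
    """Draw samples via the inverse CDF of F(x) = 1 - (1 - x**a)**b on (0, 1)."""
    u = rng.random(size)
    return (1.0 - (1.0 - u) ** (1.0 / b)) ** (1.0 / a)

def mle_shape_b(x, a):
    """MLE of shape parameter b with a known: b_hat = -n / sum(log(1 - x**a))."""
    return -len(x) / np.sum(np.log(1.0 - x ** a))

def reliability(t, a, b):
    """Reliability (survival) function of the Kumaraswamy distribution: (1 - t**a)**b."""
    return (1.0 - t ** a) ** b

rng = np.random.default_rng(1)
a_true, b_true = 2.0, 3.0                       # illustrative parameter values
x = kumaraswamy_sample(a_true, b_true, 5000, rng)
b_hat = mle_shape_b(x, a_true)
print(f"b_hat = {b_hat:.3f}, R(0.5) = {reliability(0.5, a_true, b_hat):.3f}")
```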

Publication Date
Sun Sep 04 2011
Journal Name
Baghdad Science Journal
A Mathematical Approach for Computing the Linear Equivalence of a Periodic Key-Stream Sequence Using Fourier Transform

A mathematical method with a new algorithm, implemented with the aid of the Matlab language, is proposed to compute the linear equivalence (or recursion length) of pseudo-random periodic key-stream sequences using the Fourier transform. The proposed method enables the computation of the linear equivalence in order to determine the degree of complexity of any binary or real periodic sequence produced by a linear or nonlinear key-stream generator, and the procedure can be used with comparatively greater computational ease and efficiency. The results of this algorithm are compared with the Berlekamp-Massey (BM) method, and good results are obtained; the results of the Fourier transform are more accurate than those of the BM method for computing the linear equivalence…
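A common way to connect linear complexity with the Fourier transform is Blahut's theorem: for a sequence of period N, the linear complexity equals the number of non-zero coefficients in the DFT of one period (taken over a field containing an N-th root of unity). The sketch below is a rough numerical stand-in using NumPy's complex FFT with a relative tolerance, not the paper's algorithm; the test sequence, period, and tolerance are assumptions.

```python
import numpy as np

def linear_complexity_dft(one_period, tol=1e-8):
    """Estimate the linear complexity (recursion length) of a periodic sequence as the
    number of non-zero DFT coefficients of one period (Blahut's theorem, applied here
    with the ordinary complex DFT and a relative numerical tolerance)."""
    spectrum = np.fft.fft(np.asarray(one_period, dtype=float))
    return int(np.sum(np.abs(spectrum) > tol * np.max(np.abs(spectrum))))

# Example: a sum of two sinusoids of period N = 64 satisfies a linear recursion of
# order 4, and indeed exactly four DFT coefficients are non-zero.
N = 64
t = np.arange(N)
seq = np.cos(2 * np.pi * 3 * t / N) + 0.5 * np.sin(2 * np.pi * 7 * t / N)
print(linear_complexity_dft(seq))  # -> 4
```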

Publication Date
Mon Aug 15 2022
Journal Name
Journal Of Al-qadisiyah For Computer Science And Mathematics
Passport Photo Compression: A Review

The demand for electronic passport photo (frontal facial) images has grown rapidly and now extends to Electronic Government (E-Gov) applications such as social benefits, driver's licenses, e-passports, and e-visas. With COVID-19 (coronavirus disease), facial (formal) images have become more widely used and are spreading quickly as a means of verifying an individual's identity. Unfortunately, such images contain insignificant detail against a constant background, which leads to huge byte consumption affecting storage space and transmission; the optimal solution aims to curtail the data size using compression techniques that exploit image redundancies efficiently.
