Audio Compression Using Transform Coding with LZW and Double Shift Coding
Zainab J. Ahmed & Loay E. George
Conference paper, New Trends in Information and Communications Technology Applications. First Online: 11 January 2022. Part of the Communications in Computer and Information Science book series (CCIS, volume 1511).

Abstract: The need for audio compression remains a vital issue because of its significance in reducing the data size of one of the most common digital media exchanged between distant parties. In this paper, the efficiencies of two audio compression modules were investigated: the first is based on the discrete cosine transform (DCT) and the second on the discrete wavelet transform (DWT). The proposed audio compression system consists of the following steps: (1) load the digital audio data; (2) transformation (i.e., using a bi-orthogonal wavelet or the discrete cosine transform) to decompose the audio signal; (3) quantization (depending on the transform used); (4) run-length decomposition, in which the quantized data is separated into two sequence vectors, runs and non-zero values, so that long runs of zeros are reduced. Each resulting vector is passed to an entropy encoder to complete the compression process. Two entropy encoders are used: the first is the lossless compression method LZW, and the second is an advanced version of the traditional shift coding method called double shift coding (DSC). The proposed system's performance is analyzed using distinct audio samples of different sizes and characteristics, with various audio signal parameters, and is evaluated using Peak Signal to Noise Ratio (PSNR) and Compression Ratio (CR). The outcomes on the audio samples show that the system is simple and fast and achieves good compression gain; the results also show that the DSC encoding time is less than the LZW encoding time.
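The double shift coding method is the authors' own contribution and its internals are not given in the abstract, so the minimal sketch below covers only the generic path it describes: DCT, uniform quantization, run/non-zero decomposition, and LZW entropy coding. The function names, the q_step parameter, and the block-based framing are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.fft import dct

def compress_block(samples, q_step=0.02):
    """DCT -> uniform quantization -> (runs, non-zero values) -> LZW."""
    coeffs = dct(samples, norm="ortho")
    q = np.round(coeffs / q_step).astype(int)

    # Run-length decomposition: lengths of zero runs, plus the non-zero values.
    runs, values, zeros = [], [], 0
    for c in q:
        if c == 0:
            zeros += 1
        else:
            runs.append(zeros)
            values.append(int(c))
            zeros = 0
    runs.append(zeros)
    return lzw_encode(runs), lzw_encode(values)

def lzw_encode(seq):
    """Textbook LZW over integer symbols; a real codec would also
    transmit the initial symbol alphabet to the decoder."""
    table = {(s,): i for i, s in enumerate(sorted(set(seq)))}
    w, out = (), []
    for s in seq:
        wk = w + (s,)
        if wk in table:
            w = wk
        else:
            out.append(table[w])
            table[wk] = len(table)
            w = (s,)
    if w:
        out.append(table[w])
    return out

# Illustrative signal standing in for a loaded audio block.
audio = np.sin(0.05 * np.arange(1024))
runs_code, vals_code = compress_block(audio)
```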
The increased use of hybrid PET/CT scanners, which combine detailed anatomical information with functional data, has benefits for both diagnostic and therapeutic purposes. The present study compares cross sections for producing 18F, 82Sr, and 68Ge via different reactions at incident particle energies up to 60 MeV, as part of systematic studies of particle-induced activation on natNe, natRb, natGa, and enriched 18O, 85Rb, and 69Ga targets. It includes theoretical calculation of the production yield, calculation of the required target, and a suggestion of the optimum reaction to produce Fluorine-18, Strontium-82, and Germanium-68 for use in hybrid PET/CT scanners.
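The theoretical production-yield calculation mentioned above is conventionally based on the thick-target activation integral; a sketch of that standard formula follows (the paper's exact formulation may differ):

```latex
% Standard thick-target production yield for a charged-particle reaction
% (symbols are the conventional ones, not taken from the paper).
Y = \frac{N_A\,H}{M}\; I \,\bigl(1 - e^{-\lambda t_{\mathrm{irr}}}\bigr)
    \int_{E_{\mathrm{out}}}^{E_{\mathrm{in}}} \frac{\sigma(E)}{S(E)}\,\mathrm{d}E
```

Here N_A is Avogadro's number, H the isotopic enrichment, M the molar mass, I the beam intensity (particles per second), lambda the decay constant of the product, t_irr the irradiation time, sigma(E) the reaction cross section, and S(E) the mass stopping power of the target material.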
Perchloroethylene (PERC) is commonly used as a dry-cleaning solvent and is attributed with many deleterious effects on the biological system. The study aimed to investigate the harmful effects associated with PERC exposure among dry-cleaning workers. The study was carried out on 58 adults in two groups: a PERC-exposed group of thirty-two male dry-cleaning workers using PERC as a dry-cleaning solvent, and twenty-six healthy non-exposed subjects. The history of PERC exposure, use of personal protective equipment (PPE), and safety measures of the exposed group were recorded. A blood sample was taken from each participant for measurement of hematological markers and liver and kidney function tests. The results showed that 28.1% of the workers were using …
In this research two clustering algorithms are applied, Fuzzy C-Means (FCM) and hard K-Means (HKM), to determine which of them performs better. The two algorithms are applied to a set of data collected from the Ministry of Planning on the water turbidity of five areas in Baghdad, to identify which of these areas has the least turbid (clearest) water and which months of the year are least turbid in each specified area.
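For readers unfamiliar with how FCM differs from hard K-Means, a minimal sketch of the standard Bezdek FCM update loop is given below, with hard labels recovered at the end the way HKM would assign them. The variable names and the synthetic data are illustrative, not the Ministry of Planning dataset.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Minimal FCM: X is (n_samples, n_features); returns centers, memberships."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)              # each row of U sums to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]   # weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)                      # guard against zero distance
        # u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
        U = 1.0 / ((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1))).sum(axis=2)
    return centers, U

# Two loose synthetic clusters standing in for the turbidity measurements.
X = np.random.default_rng(1).normal(size=(100, 2))
X[:50] += 3.0
centers, U = fuzzy_c_means(X, c=2)
hard_labels = U.argmax(axis=1)   # hardening memberships mimics HKM's crisp labels
```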
Text categorization refers to the process of grouping texts or documents into classes or categories according to their content. The text categorization process consists of three phases: preprocessing, feature extraction, and classification. In comparison to English, only a few studies have been done to categorize and classify the Arabic language. For a variety of applications, such as text classification and clustering, Arabic text representation is a difficult task because the Arabic language is noted for its richness, diversity, and complicated morphology. This paper presents a comprehensive analysis and comparison of research from the last five years based on the dataset, year, algorithms, and the accuracy …
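The three-phase pipeline named above (preprocessing, feature extraction, classification) is commonly realized as a vectorizer plus classifier; a minimal scikit-learn sketch with invented two-class Arabic snippets follows. The data, labels, and choice of TF-IDF with Naive Bayes are illustrative assumptions, not drawn from the surveyed studies.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Invented two-class Arabic snippets (sports vs. economy) for illustration.
docs = ["مباراة كرة القدم انتهت بالتعادل", "ارتفعت أسعار النفط اليوم",
        "الفريق فاز بالبطولة", "البورصة سجلت خسائر"]
labels = ["sports", "economy", "sports", "economy"]

# Phases 1-2: tokenization + TF-IDF feature extraction; phase 3: classification.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(docs, labels)
print(model.predict(["النفط والبورصة"]))   # expected: ['economy']
```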
Wind energy is one of the most common natural resources and plays a huge role in the energy sector. Due to the increasing demand to improve the efficiency of wind turbines and the development of the energy field, improvements have been made to design a suitable wind turbine and obtain the highest possible energy efficiency from wind. In this paper, a horizontal-axis wind turbine blade operating under low wind speed was designed using Blade Element Momentum (BEM) theory; the design of the turbine rotor blade is a difficult task because of the calculations involved in the design process. To understand the behavior of the turbine blade, the QBlade program was used to design and simulate the turbine rotor blade under working conditions. The design variables such as …
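The calculations BEM theory involves are essentially a fixed-point iteration on the axial and tangential induction factors of each blade element. A minimal sketch of that iteration follows; the blade geometry, the thin-airfoil lift curve Cl = 2*pi*alpha, the constant Cd, and the omission of tip-loss corrections are all simplifying assumptions, not the paper's design.

```python
import math

def bem_element(r, R=1.5, B=3, chord=0.1, twist=0.035, tsr=6.0, n_iter=200):
    """Fixed-point BEM iteration for one blade element (no tip-loss factor)."""
    lam_r = tsr * r / R                        # local speed ratio
    sigma = B * chord / (2 * math.pi * r)      # local solidity
    a, ap = 0.3, 0.0                           # axial / tangential induction
    for _ in range(n_iter):
        phi = math.atan2(1 - a, (1 + ap) * lam_r)   # inflow angle
        alpha = phi - twist                         # angle of attack
        cl, cd = 2 * math.pi * alpha, 0.01          # assumed airfoil polars
        cn = cl * math.cos(phi) + cd * math.sin(phi)
        ct = cl * math.sin(phi) - cd * math.cos(phi)
        a = 1.0 / (4 * math.sin(phi) ** 2 / (sigma * cn) + 1)
        ap = 1.0 / (4 * math.sin(phi) * math.cos(phi) / (sigma * ct) - 1)
    return a, ap, math.degrees(alpha)

print(bem_element(r=1.0))   # induction factors and local angle of attack
```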
The aim of this paper is to present a weak form of -light functions by using -open sets, namely -light functions, and to offer new concepts of disconnected spaces and totally disconnected spaces. The relations between them have been studied. Also, new forms of -totally disconnected and inversely -totally disconnected functions have been defined, and some examples and facts are presented.
Combining multiple images of the same scene that have different focus distances can produce clearer and sharper images with a larger depth of field. Most available image fusion algorithms produce superior results; however, they do not take the focus of the image into account. In this paper a fusion method is proposed to increase the focus of the fused image and to achieve the highest-quality image using the suggested focusing filter and the Dual Tree Complex Wavelet Transform (DT-CWT). The focusing filter consists of a combination of two filters, a Wiener filter and a sharpening filter, and is applied before the fusion operation using the DT-CWT. The common fusion rules, which are the average-fusion rule and the maximum …
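A minimal sketch of this pre-filter-then-fuse scheme is given below, using PyWavelets' ordinary DWT as a stand-in for the DT-CWT the paper uses, a Wiener filter plus an unsharp mask as the focusing filter, and the two common fusion rules the abstract names (average for the approximation band, maximum for detail bands). The filter sizes and sharpening gain are illustrative assumptions.

```python
import numpy as np
import pywt
from scipy.signal import wiener

def fuse_multifocus(img_a, img_b, wavelet="db2", levels=3):
    """Fuse two differently-focused grayscale images (float arrays in [0, 1])."""
    def focus_filter(img):
        den = wiener(img, mysize=5)             # Wiener denoising
        blur = wiener(den, mysize=9)            # cheap low-pass for unsharp mask
        return np.clip(den + (den - blur), 0, 1)  # sharpening step

    a = pywt.wavedec2(focus_filter(img_a), wavelet, level=levels)
    b = pywt.wavedec2(focus_filter(img_b), wavelet, level=levels)

    # Fusion rules: average the approximation band, max-abs for detail bands.
    fused = [(a[0] + b[0]) / 2]
    for da, db in zip(a[1:], b[1:]):
        fused.append(tuple(np.where(np.abs(ca) >= np.abs(cb), ca, cb)
                           for ca, cb in zip(da, db)))
    return pywt.waverec2(fused, wavelet)

fused = fuse_multifocus(np.random.rand(128, 128), np.random.rand(128, 128))
```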
In this work, a magnetic switch was prepared using two types of ferrofluid materials: pure ferrofluid and ferrofluid doped with copper nanoparticles (10 nm). The critical magnetic field (Hc) and the magnetic saturation state (Hs) were studied using three types of laser sources. The main parameters of the magnetic switch measured using pure ferrofluid and a He-Ne laser source were Hc (0.5 mV, 0.4 G) and Hs (8.5 mV, 3 G); for the ferrofluid doped with copper nanoparticles they were Hc (1 mV, 4 G) and Hs (15 mV, 9.6 G). Using a green semiconductor laser, the values for the pure ferrofluid were Hc (0.5 mV, 0.3 G) and Hs (15 mV, 2.9 G), while for the ferrofluid doped with copper nanoparticles they were Hc (0.5 mV, 1 G) and Hs (12 mV, 2.8 G), and using the violet semiconductor laser …
Optimizing system performance in dynamic and heterogeneous environments and the efficient management of computational tasks are crucial. This paper therefore examines task scheduling and resource allocation algorithms in some depth. The work evaluates five algorithms, Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across various workloads achieved by varying the task-to-node ratio. The paper identifies finish time and deadline as two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behavior of these algorithms across different workloads was carried out. Results from the experiment …
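To make the setup concrete, a minimal sketch of one of the five evaluated metaheuristics, Simulated Annealing, applied to task-to-node assignment with finish time (makespan) as the objective is shown below. The cooling schedule, node speeds, and workload are illustrative assumptions, not the paper's experimental configuration.

```python
import math
import random

def makespan(assign, task_len, node_speed):
    """Finish time: the largest per-node completion time for an assignment."""
    load = [0.0] * len(node_speed)
    for t, n in enumerate(assign):
        load[n] += task_len[t] / node_speed[n]
    return max(load)

def anneal(task_len, node_speed, steps=20000, T0=10.0):
    """Simulated annealing over task-to-node assignments (linear cooling)."""
    n_tasks, n_nodes = len(task_len), len(node_speed)
    cur = [random.randrange(n_nodes) for _ in range(n_tasks)]
    cur_cost = makespan(cur, task_len, node_speed)
    best, best_cost = cur[:], cur_cost
    for step in range(steps):
        T = T0 * (1 - step / steps) + 1e-9
        cand = cur[:]
        cand[random.randrange(n_tasks)] = random.randrange(n_nodes)  # move one task
        cost = makespan(cand, task_len, node_speed)
        # Accept improvements always, worse moves with Boltzmann probability.
        if cost < cur_cost or random.random() < math.exp((cur_cost - cost) / T):
            cur, cur_cost = cand, cost
            if cost < best_cost:
                best, best_cost = cand[:], cost
    return best, best_cost

tasks = [random.uniform(1, 10) for _ in range(40)]   # 40 tasks, 4 nodes
nodes = [1.0, 1.5, 2.0, 2.5]
print(anneal(tasks, nodes)[1])   # best finish time found
```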