Audio Compression Using Transform Coding with LZW and Double Shift Coding
Zainab J. Ahmed & Loay E. George
Conference paper, New Trends in Information and Communications Technology Applications; first online: 11 January 2022. Part of the Communications in Computer and Information Science book series (CCIS, volume 1511).

Abstract: The need for audio compression is still a vital issue because of its significance in reducing the size of one of the most common digital media exchanged between distant parties. In this paper, the efficiencies of two audio compression modules were investigated: the first is based on the discrete cosine transform and the second on the discrete wavelet transform. The proposed audio compression system consists of the following steps: (1) load the digital audio data; (2) transform (i.e., using the bi-orthogonal wavelet or the discrete cosine transform) to decompose the audio signal; (3) quantize (depending on the transform used); (4) run-length code, in which the quantized data are separated into two sequence vectors, runs and non-zero values, to shrink the long zero-run sequences. Each resulting vector is passed to an entropy encoder to complete the compression process. Two entropy encoders are used: the first is the lossless compression method LZW, and the second is an advanced version of the traditional shift coding method called double shift coding (DSC). The proposed system's performance is analyzed using distinct audio samples of different sizes and characteristics with various audio signal parameters, and is evaluated using Peak Signal to Noise Ratio and Compression Ratio. The outcomes on the audio samples show that the system is simple and fast and achieves good compression gain. The results also show that the DSC encoding time is less than the LZW encoding time.
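The abstract names the pipeline but gives no implementation; the following is a minimal sketch of the transform-quantize-run-length front end it describes, assuming scipy's DCT-II, a uniform quantizer, and illustrative parameter names (none of these details are taken from the paper):

```python
# Sketch of the DCT branch: transform -> uniform quantization ->
# decomposition into (runs, values) vectors. The step size and the use of
# scipy's DCT-II are assumptions for illustration.
import numpy as np
from scipy.fft import dct, idct

def encode_frame(samples, step=0.02):
    coeffs = dct(samples, norm='ortho')               # step 2: transform
    q = np.round(coeffs / step).astype(np.int32)      # step 3: quantization
    runs, values, run = [], [], 0                     # step 4: run/value split
    for c in q:
        if c == 0:
            run += 1                                  # extend the zero run
        else:
            runs.append(run)                          # zeros before this value
            values.append(int(c))
            run = 0
    runs.append(run)                                  # trailing zeros
    return runs, values      # each vector then goes to an entropy coder

def decode_frame(runs, values, step=0.02):
    q = []
    for r, v in zip(runs, values):
        q.extend([0] * r)
        q.append(v)
    q.extend([0] * runs[-1])
    return idct(np.asarray(q, dtype=float) * step, norm='ortho')
```

In this sketch the entropy-coding stage (LZW or double shift coding) would consume the two integer vectors separately, which is exactly the stage where the paper compares its two encoders.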
In this study, a fast block-matching search algorithm based on block descriptors and multilevel block filtering is introduced. The descriptors used are the mean and a set of centralized low-order moments. Hierarchical filtering and the MAE similarity measure were adopted to nominate the best similar blocks lying within the pool of neighboring blocks. As a next step after block nomination, the similarity of the mean and moments is used to classify the nominated blocks into one of three sub-pools, each representing a certain nomination priority level (i.e., most, less, and least likely). The main reason for introducing the nomination and classification steps is a significant reduction in the number of matching instances for the pixels belonging to the c…
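The abstract gives the filtering idea without code; here is a hedged sketch of how a descriptor filter followed by MAE matching might look, where the exact descriptor set (mean plus second and third central moments) and the tolerance value are assumptions for illustration:

```python
# Two-level nomination: cheap descriptor comparison first, full MAE second.
import numpy as np

def descriptors(block):
    m = block.mean()
    d = block - m
    # mean + low-order central moments (assumed: 2nd and 3rd order)
    return np.array([m, (d ** 2).mean(), (np.abs(d) ** 3).mean()])

def best_match(target, candidates, desc_tol=4.0):
    t_desc = descriptors(target)
    best, best_err = None, np.inf
    for idx, cand in enumerate(candidates):
        # Level 1: reject blocks whose descriptors differ too much.
        if np.abs(descriptors(cand) - t_desc).max() > desc_tol:
            continue
        # Level 2: exact MAE computed only for the nominated blocks.
        err = np.abs(cand - target).mean()
        if err < best_err:
            best, best_err = idx, err
    return best, best_err
```

A three-sub-pool priority classification, as in the abstract, would additionally sort the level-1 survivors by descriptor distance before the MAE pass.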
This research aims to distinguish the reef environment from the non-reef environment. The Oligocene–Miocene succession in western Iraq was selected as a case study, represented by the reefal limestone facies of the Anah Formation (Late Oligocene), deposited in reef and back-reef environments; the dolomitic limestone of the Euphrates Formation (Early Miocene), deposited in open-sea environments; and the gypsiferous marly limestone of the Fatha Formation (Middle Miocene), deposited in a lagoonal environment. The content of the rare earth elements (REEs) (La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Er, Ho, Tm, Yb, Lu, and Y) in the reef facies appears to be much lower than that in the non-reef facies. The open-sea facies have a low REE content due to bein…
The aim of this research is to compare traditional and modern methods of obtaining the optimal solution, using dynamic programming and intelligent algorithms to solve project management problems.
It shows the possible ways in which these problems can be addressed, drawing on a schedule of interrelated and sequential activities, and clarifies the relationships between the activities in order to determine the beginning and end of each activity, the duration and total cost of the project, and the time consumed by each activity, as well as the objectives the project pursues through planning, implementation, and monitoring to stay within the assessed budget.
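The abstract names dynamic programming but not a specific formulation; as one concrete illustration of deriving each activity's start, end, and the total project duration from a precedence schedule, here is a critical-path-style forward pass (the activity data are invented for the example):

```python
# Forward pass over a precedence schedule: earliest start/finish per
# activity and the total project duration.
def forward_pass(durations, predecessors):
    earliest_start, earliest_finish = {}, {}

    def finish(act):
        if act not in earliest_finish:
            es = max((finish(p) for p in predecessors.get(act, [])), default=0)
            earliest_start[act] = es
            earliest_finish[act] = es + durations[act]
        return earliest_finish[act]

    for act in durations:
        finish(act)
    return earliest_start, earliest_finish, max(earliest_finish.values())

# Hypothetical schedule: C can only start after A and B are both done.
durations = {'A': 3, 'B': 5, 'C': 2}
predecessors = {'C': ['A', 'B']}
es, ef, total = forward_pass(durations, predecessors)   # total == 7
```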
Silver nanoparticles (Ag NPs) were synthesized using an argon-gas plasma jet. The prepared Ag NPs were characterized by Atomic Absorption Spectroscopy (AAS); measurements were performed for different exposure times of 15, 30, 45, and 60 s. The results show a low nano-silver concentration at the shortest exposure (15 s) and a very high concentration at 60 s. UV-VIS spectrometry of the nano-silver at different plasma exposure times shows a Surface Plasmon Resonance (SPR) peak around 419 nm, and an energy gap of 4.1 eV for the 15 s exposure and 1.6 eV for the 60 s exposure. A Scanning Probe Microscope (SPM) was used to characterize the silver nanoparticles; the average diameter of the nano-silver for the 15 s exp…
Recently, the development and application of hydrological models based on Geographical Information Systems (GIS) have increased around the world. One of the most important applications of GIS is mapping the Curve Number (CN) of a catchment. In this research, three software tools, ArcView GIS 9.3 with ArcInfo, the Arc Hydro Tool, and the Geospatial Hydrologic Modeling Extension (HEC-GeoHMS) for ArcView GIS 9.3, were used to calculate the CN of the 19,210 ha Salt Creek (SC) watershed located in Osage County, Oklahoma, USA. Multiple layers were combined and examined using the Environmental Systems Research Institute (ESRI) ArcMap 2009. These layers are a soil layer (Soil Survey Geographic, SSURGO) and a 30 m × 30 m resolution Digital Elevati…
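The overlay itself is performed inside ArcView/HEC-GeoHMS, but the number the workflow ultimately produces is an area-weighted composite CN; the short sketch below only illustrates that weighting step, with the CN values and land-cover split invented for the example:

```python
# Area-weighted composite curve number from overlaid soil/land-use polygons.
def composite_cn(polygons):
    """polygons: iterable of (area_ha, cn) pairs from the GIS overlay."""
    total_area = sum(area for area, _ in polygons)
    return sum(area * cn for area, cn in polygons) / total_area

# Invented split of the 19,210 ha watershed into two hydrologic classes:
print(composite_cn([(11526, 79), (7684, 60)]))   # ~71.4
```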
A superpixel can be defined as a group of pixels that share similar characteristics, which can be very helpful for image segmentation. Superpixel segmentation is generally color-based, together with other features such as texture and statistics. Many algorithms are available for segmenting superpixels, such as Simple Linear Iterative Clustering (SLIC) and Density-Based Spatial Clustering of Applications with Noise (DBSCAN). The SLIC algorithm essentially relies on choosing N random or regular seed points covering the image to be segmented. In this paper, a split-and-merge algorithm was used instead, to avoid having to determine the seed points' locations and number as well as the other parameters involved. The overall results were better than the SL…
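The paper's split-and-merge variant is not spelled out in the abstract; the sketch below shows only the quadtree split step under an assumed homogeneity test (intensity variance below a threshold), with the merge step omitted for brevity:

```python
# Quadtree split: recursively divide until each region is homogeneous.
import numpy as np

def split(img, x, y, w, h, var_thr=25.0, min_size=4, regions=None):
    if regions is None:
        regions = []
    block = img[y:y + h, x:x + w]
    if block.var() <= var_thr or w <= min_size or h <= min_size:
        regions.append((x, y, w, h))      # homogeneous leaf region
    else:
        hw, hh = w // 2, h // 2           # subdivide into four quadrants
        for dx, dy, sw, sh in ((0, 0, hw, hh), (hw, 0, w - hw, hh),
                               (0, hh, hw, h - hh), (hw, hh, w - hw, h - hh)):
            split(img, x + dx, y + dy, sw, sh, var_thr, min_size, regions)
    return regions
```

Unlike SLIC, no seed points are chosen here: the recursion adapts the number and placement of regions to the image content, which is the advantage the abstract claims.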
This paper focuses on reducing the time for text-processing operations by taking advantage of enumerating each string using a multi-hashing methodology. Text analysis is an important subject for any system that deals with strings (sequences of characters from an alphabet) and text processing (e.g., word processors, text editors, and other text-manipulation systems). Many problems arise when dealing with string operations involving an unfixed number of characters (e.g., in execution time); this is due to the overhead of embedded operations (such as symbol matching and conversion operations). The execution time largely depends on the string's characteristics, especially its length (i.e., the number of characters consisting…
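The core idea of string enumeration can be shown in a few lines: hash each string to fixed-width integers once, then compare integers instead of characters. The two-hash pair and the polynomial constants below are common choices, not taken from the paper:

```python
# One-time O(len) enumeration, then O(1) equality tests on integer pairs.
def enumerate_string(s):
    h1 = h2 = 0
    for ch in s:
        h1 = (h1 * 131 + ord(ch)) % (2 ** 61 - 1)   # assumed constants
        h2 = (h2 * 257 + ord(ch)) % (2 ** 31 - 1)
    return h1, h2

table = {w: enumerate_string(w) for w in ("editor", "edits", "text")}
assert table["editor"] == enumerate_string("editor")   # match via integers
```

Using two independent hashes makes an accidental collision between distinct strings far less likely than with a single hash.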
A study is made of the size and dynamic activity of sunspots using a Matlab code for automatic detection, ''mySS.m'', written for this purpose, which mainly produces a good estimate of the sunspot diameter (in km). The theory of sunspot size is described using equations from which the growth and decay phases and the sunspot area can be calculated. Two types of images, namely H-alpha images and HMI magnetograms, were used. The results are divided into four main parts. The first part is automatic sunspot size detection by the Matlab program. The second part is the numerical calculation of the sunspot growth and decay phases. The third part is the calculation of the sunspot area. The final part explains the sunspot activit…
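The authors' ''mySS.m'' is not reproduced in the abstract; as a hedged illustration (in Python rather than Matlab) of how a diameter estimate in km can follow from a thresholded image patch, assuming a known plate scale and a roughly circular spot:

```python
# Equal-area-circle diameter of a dark (below-threshold) region.
import numpy as np

def spot_diameter_km(patch, threshold, km_per_px):
    """patch: 2-D intensity array; pixels darker than threshold = spot."""
    n_px = np.count_nonzero(patch < threshold)
    area_km2 = n_px * km_per_px ** 2
    return 2.0 * np.sqrt(area_km2 / np.pi)
```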