Digital audio requires transmitting large amounts of audio information through the most common communication systems, which in turn creates challenges in both storage and archiving. In this paper, an efficient audio compression scheme is proposed. It relies on a combined transform coding scheme consisting of: i) a bi-orthogonal (tap 9/7) wavelet transform to decompose the audio signal into low and multiple high sub-bands; ii) a DCT applied to the produced sub-bands to de-correlate the signal; iii) progressive hierarchical quantization of the combined-transform output, followed by traditional run-length encoding (RLE); and iv) LZW coding to generate the output bitstream. Peak signal-to-noise ratio (PSNR) and compression ratio (CR) were used to conduct a comparative analysis of the whole system's performance. Many audio test samples, of various sizes and with varying features, were utilized to test its behavior. The simulation results demonstrate the efficiency of these combined transforms when LZW is used for data compression. The compression results are encouraging and show a remarkable reduction in audio file size with good fidelity.
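The final entropy-coding stage can be illustrated with a minimal LZW encoder. This is a generic textbook sketch, not the paper's implementation; the function and variable names are illustrative:

```python
def lzw_encode(data: bytes) -> list[int]:
    """Encode a byte string with LZW, returning a list of integer codes."""
    # Initialize the dictionary with all single-byte sequences (codes 0-255).
    dictionary = {bytes([i]): i for i in range(256)}
    next_code = 256
    w = b""
    codes = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc                       # extend the current match
        else:
            codes.append(dictionary[w])  # emit the longest known prefix
            dictionary[wc] = next_code   # register the new sequence
            next_code += 1
            w = bytes([byte])
    if w:
        codes.append(dictionary[w])
    return codes

# Repetitive input compresses well: 7 bytes become 4 codes.
example = lzw_encode(b"ABABABA")
```

Because the dictionary grows as repeated sub-sequences appear, the quantized and run-length-encoded stream, which is highly repetitive, is a good fit for this stage.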
Steganography is the art of hiding information, in which an unsuspicious cover signal carries the secret information. A good steganography technique must satisfy the important criteria of robustness, security, imperceptibility, and capacity. Improving any one of these criteria affects the others, because the criteria overlap. In this work, a secure, high-capacity audio steganography method is proposed, based on randomly replacing LSBs of an encrypted cover with encrypted message bits at random positions. The research also includes a capacity study of the audio file, speech or music, for securely carrying secret images, so it is difficult for unauthorized persons to suspect
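The random LSB replacement step can be sketched as follows. The sketch assumes integer audio samples and omits the encryption of the cover and message described above; the key-seeded PRNG stands in for whatever position-selection scheme the method actually uses, and all names are illustrative:

```python
import random

def embed_lsb(samples: list[int], bits: list[int], key: int) -> list[int]:
    """Embed message bits into the LSBs of samples at key-seeded random positions."""
    out = list(samples)
    rng = random.Random(key)                       # shared secret key seeds the PRNG
    positions = rng.sample(range(len(out)), len(bits))
    for pos, bit in zip(positions, bits):
        out[pos] = (out[pos] & ~1) | bit           # overwrite the least significant bit
    return out

def extract_lsb(samples: list[int], n_bits: int, key: int) -> list[int]:
    """Recover bits by regenerating the same positions from the shared key."""
    rng = random.Random(key)
    positions = rng.sample(range(len(samples)), n_bits)
    return [samples[pos] & 1 for pos in positions]
```

Since each embedded bit changes a sample by at most 1, the distortion stays below the audible threshold for typical 16-bit audio, which is what gives LSB methods their imperceptibility.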
The current research aims to identify the decoding errors contained in first-grade reading. A decoding error was defined as a failure to retrieve or identify information. The researcher diagnosed the errors and presented them to a group of first-grade teachers, who made appropriate adjustments; using percentages, the agreement ratio on the errors was computed and adjusted. The researcher recommended a set of proposals and recommendations that can be pursued in the future to advance the scientific level
COVID-19 has roused the scientific community, prompting calls for immediate solutions to avoid the infection or at least reduce the virus's spread. Despite the availability of several licensed vaccines to boost human immunity against the disease, various mutated strains of the virus continue to emerge, posing a danger to the vaccines' efficacy against new mutations. As a result, the importance of early detection of COVID-19 infection becomes evident. Cough is a prevalent symptom in all COVID-19 mutations. Unfortunately, coughing can be a symptom of a variety of diseases, including pneumonia and influenza. Thus, identifying coughing behavior might help clinicians diagnose COVID-19 infection earlier and distinguish
In this research, we estimated the survival function for data affected by the disturbances and confusion of the Iraq Household Socio-Economic Survey (IHSES II 2012), for five-year age-group data following the Generalized Gamma (GG) distribution. Two methods were used for estimation and fitting: the Principle of Maximum Entropy (POME), and a bootstrap method with a nonparametric kernel smoothing function, to overcome the mathematical problems posed by the integrals contained in this distribution, in particular the incomplete gamma function, alongside the traditional Maximum Likelihood (ML) method. The comparison was made on the basis of the method of the Cen
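As a sketch of the traditional ML route, SciPy's `gengamma` distribution can fit a generalized gamma model and yield a parametric survival function. The parameter values and sample below are simulated for illustration and are not the survey's data:

```python
import numpy as np
from scipy import stats

# Simulated lifetimes from a generalized gamma distribution
# (illustrative shapes a, c and scale; not the IHSES data).
a_true, c_true, scale_true = 2.0, 1.5, 10.0
sample = stats.gengamma.rvs(a_true, c_true, scale=scale_true,
                            size=1000, random_state=0)

# Maximum likelihood fit; floc=0 pins the location, since lifetimes start at zero.
a_hat, c_hat, loc_hat, scale_hat = stats.gengamma.fit(sample, floc=0)

# Parametric estimate of the survival function S(t) = 1 - F(t).
t = np.linspace(0.1, sample.max(), 200)
surv = stats.gengamma.sf(t, a_hat, c_hat, loc=loc_hat, scale=scale_hat)
```

Using the closed-form `sf` sidesteps direct numerical integration of the incomplete gamma function, which is the difficulty the POME and bootstrap-kernel approaches are also designed to avoid.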
In the present work, a theoretical analysis based on a new higher-order shear deformation theory for simply supported cross-ply laminated plates is developed. The new displacement field of the middle surface is expanded as a combination of exponential and trigonometric functions of the thickness coordinate, with the transverse displacement taken to be constant through the thickness. The governing equations are derived using Hamilton's principle and solved using the Navier solution method to obtain the deflection and stresses under a uniform sinusoidal load. The effect of many design parameters, such as the number of laminates, aspect ratio, and thickness ratio, on the static behavior of the laminated composite plate has been studied. The
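For context, the Navier method for a simply supported rectangular plate expands the transverse deflection in a double sine series satisfying the edge conditions; this is the standard form, with the paper's specific higher-order displacement field omitted:

```latex
w(x,y) = \sum_{m=1}^{\infty}\sum_{n=1}^{\infty} W_{mn}\,
         \sin\frac{m\pi x}{a}\,\sin\frac{n\pi y}{b},
\qquad
q(x,y) = q_0\,\sin\frac{\pi x}{a}\,\sin\frac{\pi y}{b},
```

where $a$ and $b$ are the plate dimensions. For the sinusoidal load shown, only the $m=n=1$ term survives, so the governing equations reduce directly to an algebraic system for $W_{11}$.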
This paper is concerned with combining two different transforms to present a new joint transform, FHET, and its inverse transform, IFHET. The most important property of the FHET, called the finite Hankel–Elzaki transform of the Bessel differential operator property, is stated and proved; this property is discussed for two different boundary conditions, Dirichlet and Robin. The importance of this property is shown by solving axisymmetric partial differential equations, passing directly to an algebraic equation. The joint finite Hankel–Elzaki transform method is also applied to a mathematical-physical problem, the hot-dog problem. A steady state which does not depend on time was discussed f
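For reference, the two constituent transforms have the standard forms below; these follow the usual conventions and may differ from the authors' notation in scale factors:

```latex
E[f](v) = v \int_{0}^{\infty} f(t)\, e^{-t/v}\, dt
\qquad \text{(Elzaki transform)},
```

```latex
H_n[f](\lambda_i) = \int_{0}^{a} r\, f(r)\, J_n(\lambda_i r)\, dr
\qquad \text{(finite Hankel transform)},
```

where $J_n$ is the Bessel function of the first kind and the eigenvalues $\lambda_i$ are fixed by the boundary condition, e.g. $J_n(\lambda_i a) = 0$ in the Dirichlet case. The joint FHET applies these in succession, which is what turns the Bessel differential operator into multiplication and the PDE into an algebraic equation.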
The demand for electronic passport photo (frontal facial) images has grown rapidly. It now extends to Electronic Government (E-Gov) applications such as social benefits, driver's licenses, e-passports, and e-visas. With COVID-19 (coronavirus disease), facial (formal) images are becoming more widely used and are spreading quickly, being used to verify an individual's identity. Unfortunately, they come with insignificant detail against a constant background, which leads to huge byte consumption affecting storage space and transmission; the optimal solution aims to curtail data size using compression techniques based on efficiently exploiting image redundancies.
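The redundancy of a near-constant background is exactly what run-length encoding exploits. A minimal sketch, with illustrative pixel values rather than the paper's actual coder:

```python
def rle_encode(pixels: list[int]) -> list[tuple[int, int]]:
    """Run-length encode a pixel sequence as (value, count) pairs."""
    runs: list[tuple[int, int]] = []
    for p in pixels:
        if runs and runs[-1][0] == p:
            runs[-1] = (p, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((p, 1))              # start a new run
    return runs

# A scanline dominated by a constant background collapses to a few runs:
# 100 pixels become 4 (value, count) pairs.
row = [255] * 90 + [64, 64, 32] + [255] * 7
encoded = rle_encode(row)
```

The longer the constant-background runs in a formal photo, the higher the achievable compression ratio, which is why such images are particularly good candidates for redundancy-based coding.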