<span>Digital audio systems must transmit large volumes of audio information through the most common communication channels, which in turn creates challenges in both storage and archiving. In this paper, an efficient audio compression scheme is proposed. It relies on a combined transform coding pipeline consisting of i) a bi-orthogonal (tap 9/7) wavelet transform that decomposes the audio signal into one low and multiple high sub-bands, ii) a DCT applied to the resulting sub-bands to de-correlate the signal, iii) progressive hierarchical quantization of the combined-transform output followed by traditional run-length encoding (RLE), and iv) LZW coding to generate the output bitstream. Peak signal-to-noise ratio (PSNR) and compression ratio (CR) were used to conduct a comparative analysis of the performance of the whole system. Many audio test samples of various sizes and features were used to evaluate the system's behavior. The simulation results demonstrate the efficiency of these combined transforms when LZW is used within the domain of data compression. The compression results are encouraging and show a remarkable reduction in audio file size with good fidelity.</span>
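The two evaluation measures named in the abstract, PSNR and CR, can be sketched as follows. This is a minimal illustration in Python/NumPy under our own conventions (function names and the choice of peak value are assumptions), not the authors' implementation.

```python
import numpy as np

def psnr(original, reconstructed, peak=None):
    """Peak signal-to-noise ratio in dB between two audio signals."""
    original = np.asarray(original, dtype=np.float64)
    reconstructed = np.asarray(reconstructed, dtype=np.float64)
    mse = np.mean((original - reconstructed) ** 2)
    if mse == 0.0:
        return float("inf")  # identical signals
    if peak is None:
        peak = np.max(np.abs(original))  # assumed peak: max amplitude of the original
    return 10.0 * np.log10(peak ** 2 / mse)

def compression_ratio(original_bytes, compressed_bytes):
    """CR = size of the original file divided by size of the compressed file."""
    return original_bytes / compressed_bytes
```

For example, a file compressed from 1000 bytes to 250 bytes has CR = 4.0; higher PSNR indicates better fidelity of the reconstructed signal.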
The behaviour of certain nonlinear dynamical systems is described as chaos: the systems' variables change with time and display very high sensitivity to initial conditions. In this paper, we study an archetypal system of ordinary differential equations, the Rössler model, through its two-dimensional phase-space projections. The system displays continuous-time chaos and is described by three coupled nonlinear differential equations. We study its characteristics and determine the control parameters that lead to different behaviors of the system output: periodic, quasi-periodic, and chaotic. The time series, attractor, fast Fourier transform, and bifurcation diagram for different parameter values are presented.
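The three coupled equations of the Rössler system can be integrated numerically to reproduce the time series and attractor discussed above. A minimal sketch with a fourth-order Runge-Kutta integrator follows; the parameter values a = b = 0.2, c = 5.7 are the commonly cited chaotic regime, not necessarily the values studied in the paper.

```python
import numpy as np

def rossler_deriv(state, a=0.2, b=0.2, c=5.7):
    """Rössler system: three coupled nonlinear ODEs."""
    x, y, z = state
    return np.array([-y - z, x + a * y, b + z * (x - c)])

def rk4_integrate(f, state0, dt, n_steps):
    """Classic fourth-order Runge-Kutta integration of a 3-D ODE system."""
    states = np.empty((n_steps + 1, 3))
    s = np.array(state0, dtype=np.float64)
    states[0] = s
    for i in range(n_steps):
        k1 = f(s)
        k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2)
        k4 = f(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
        states[i + 1] = s
    return states
```

Plotting the (x, y) columns of the returned trajectory gives the familiar two-dimensional projection of the attractor; varying c while recording local maxima of x yields a bifurcation diagram.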
It has been shown in ionospheric research that calculation of the total electron content (TEC) is an important factor in global navigation satellite systems. In this study, TEC was calculated over Baghdad city, Iraq, using a combination of two numerical methods, the composite Simpson and composite trapezoidal rules. TEC was computed as the line integral of the electron density derived from the International Reference Ionosphere (IRI-2012) and NeQuick2 models from 70 to 2000 km above the Earth's surface. The hour of the day, the day number of the year, and the sunspot number R12 were chosen as inputs for the calculation techniques to take into account the latitudinal, diurnal, and seasonal variation of TEC. The results of latitudinal variation of TE
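The two quadrature rules named in the abstract can be sketched as follows. The integrand in the study is the model electron density as a function of height; here any callable profile can be passed in, and the demo below verifies the rules on a known integral rather than on IRI2012/NeQuick2 output, which is not reproduced here.

```python
import numpy as np

def composite_trapezoid(f, a, b, n):
    """Composite trapezoidal rule over [a, b] with n subintervals."""
    x = np.linspace(a, b, n + 1)
    y = f(x)
    h = (b - a) / n
    return h * (0.5 * y[0] + y[1:-1].sum() + 0.5 * y[-1])

def composite_simpson(f, a, b, n):
    """Composite Simpson's rule over [a, b]; n must be even."""
    if n % 2 != 0:
        raise ValueError("n must be even")
    x = np.linspace(a, b, n + 1)
    y = f(x)
    h = (b - a) / n
    # weights: 1, 4, 2, 4, ..., 2, 4, 1
    return (h / 3.0) * (y[0] + 4.0 * y[1:-1:2].sum() + 2.0 * y[2:-1:2].sum() + y[-1])
```

With an electron-density profile Ne(h), TEC would be obtained as `composite_simpson(Ne, 70.0, 2000.0, n)` (in consistent units); Simpson's rule converges as h^4 versus h^2 for the trapezoidal rule, which is why combining or comparing the two is informative.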
Hypoxic training is one of the methods adopted in sports training, especially in activities that depend on the aerobic system for their performance. It involves training with a lack of oxygen by reducing its partial pressure; this method targets the functional organs, producing temporary responses during training and permanent responses after training as an adaptation to this type of load. The study aimed to identify the effect of hypoxic exercises using the training mask and the extent of the change in some biochemical indicators, and in addition to identify the effect of these exercises on the energy-expenditure indicator, maximal aerobic velocity (VMA), and the achievement of the effectiveness of
In this research, the performance of two kinds of membranes was examined for recovering nutrients (protein and lactose) from the whey produced by the soft-cheese industry at the General Company for Food Products in Abu Ghraib. The whey was treated in two stages. The first stage presses the whey through a plate-type microfilter made of polyvinylidene difluoride (PVDF) with an 800 kilodalton (kDa) cut-off; the membrane separates the whey into a permeate carrying the main nutrients while removing the fat and microorganisms. The second stage isolates the protein using a plate-type ultrafilter made of polyethersulfone (PES) with 10 and 60 kDa cut-offs and recovers the lactose in the form of permeate.
The results showed that the percen
Diabetes is one of the increasingly common chronic diseases, affecting millions of people around the world. Diabetes diagnosis, prediction, proper treatment, and management are essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection and prediction of the disease and of its consequences, such as hypo- and hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, k-nearest neighbors (KNN), and random forest. We conducted two experiments: the first used all 12 features of the dataset, where random forest outperformed the others with 98.8% accuracy. The second experiment used only five att
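One of the three classifiers named above, KNN, can be sketched in a few lines of pure NumPy. Since the Iraqi patient dataset is not reproduced here, the usage in the test below relies on synthetic two-cluster data; this is an illustration of the technique, not the authors' pipeline.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    """Minimal k-nearest-neighbors classifier: Euclidean distance, majority vote."""
    preds = []
    for x in X_test:
        dist = np.linalg.norm(X_train - x, axis=1)   # distance to every training point
        nearest = y_train[np.argsort(dist)[:k]]      # labels of the k closest points
        labels, counts = np.unique(nearest, return_counts=True)
        preds.append(labels[np.argmax(counts)])      # most frequent label wins
    return np.array(preds)
```

In practice, features should be standardized first, since Euclidean distance is dominated by features with large numeric ranges (e.g. glucose in mg/dL versus BMI).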
Shadow detection and removal is an important task when dealing with color outdoor images. Shadows are generated by a local and relative absence of light: first, they are a local decrease in the amount of light that reaches a surface; second, they are a local change in the amount of light reflected by a surface toward the observer. Most shadow detection and segmentation methods are based on image analysis, but several factors can affect the detection result owing to the complexity of the scene. In this paper, a segmentation-based test method is presented to detect shadows in an image, and a function-based approach is used to remove the shadow from the image.
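The abstract does not specify the segmentation test or the removal function, so the following is only a minimal sketch under our own assumptions: shadows are flagged as pixels well below the global mean intensity, and removal rescales them to match the lit-region mean. A real method would work on color images and handle soft shadow boundaries.

```python
import numpy as np

def shadow_mask(gray, factor=0.5):
    """Flag pixels darker than factor * global mean as shadow candidates."""
    return gray < factor * gray.mean()

def remove_shadow(gray, mask):
    """Rescale shadow pixels so their mean matches the mean of lit pixels."""
    out = gray.astype(np.float64).copy()
    gain = out[~mask].mean() / out[mask].mean()  # brightening factor
    out[mask] *= gain
    return np.clip(out, 0.0, 255.0)
```

The multiplicative gain (rather than an additive offset) reflects the fact that a shadow attenuates the illumination component of the image roughly proportionally.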
The penalized least squares method is a popular approach to high-dimensional data, where the number of explanatory variables is larger than the sample size. Penalized least squares offers high prediction accuracy and performs estimation and variable selection at once. It yields a sparse model, that is, a model with few variables, which can therefore be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method, and hence a robust penalized estimator and
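A concrete instance of penalized least squares is the lasso, minimizing 0.5 * ||y - Xb||^2 + lam * ||b||_1, which can be solved by cyclic coordinate descent with soft-thresholding. This is an illustrative choice on our part, since the abstract does not name a specific penalty; a robust variant would replace the squared loss with, e.g., the Huber loss.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the L1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent.

    Minimizes 0.5 * ||y - X b||^2 + lam * ||b||_1.
    """
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with feature j's contribution removed
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
    return beta
```

Because the soft-threshold sets small coordinates exactly to zero, the fitted vector is sparse: estimation and variable selection happen in the same pass, which is the property the abstract highlights.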
As we live in the era of the fourth technological revolution, it has become necessary to use artificial intelligence to generate electric power from sustainable solar energy, especially in Iraq, which has gone through repeated crises and suffers from a severe shortage of electric power because of the wars and calamities it has endured. The impact of that period is still evident in all aspects of Iraqis' daily life because of the remnants of wars, siege, terrorism, the wrong policies of earlier and later governments, and regional interventions and their consequences, such as the destruction of electric power stations, together with the population increase, which must be matched by an increase in electric power stations,