Digital audio requires transmitting large volumes of audio information through the most common communication systems, which in turn raises challenges in both storage and archiving. In this paper, an efficient audio compression scheme is proposed. It depends on a combined transform coding scheme consisting of: i) a bi-orthogonal (9/7-tap) wavelet transform to decompose the audio signal into low and multiple high sub-bands; ii) a DCT applied to the produced sub-bands to de-correlate the signal; iii) progressive hierarchical quantization of the combined-transform output, followed by traditional run-length encoding (RLE); and iv) LZW coding to generate the output bitstream. Peak signal-to-noise ratio (PSNR) and compression ratio (CR) were used to conduct a comparative analysis of the performance of the whole system. Many audio test samples of various sizes and features were utilized to test the performance. The simulation results demonstrate the efficiency of these combined transforms when LZW is used within the domain of data compression. The compression results are encouraging and show a remarkable reduction in audio file size with good fidelity.
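As a rough illustration of the combined transform chain, the sketch below strings together a single-level 9/7 wavelet decomposition, a DCT pass, uniform quantization, and run-length encoding, assuming the pywt and scipy packages; the quantization step size, the single decomposition level, and the function names are illustrative stand-ins, not the paper's actual settings. An LZW pass over the run-length output would complete the chain.

```python
# Minimal sketch of the combined transform coding chain: CDF 9/7 wavelet
# decomposition -> DCT de-correlation -> uniform quantization -> RLE.
# The q_step value and single-level decomposition are illustrative choices.
import numpy as np
import pywt
from scipy.fft import dct

def compress_block(audio, q_step=0.05):
    # i) bi-orthogonal 9/7 wavelet: one low band + one high band
    low, high = pywt.dwt(audio, "bior4.4")
    # ii) DCT on each sub-band to de-correlate remaining structure
    coeffs = np.concatenate([dct(low, norm="ortho"), dct(high, norm="ortho")])
    # iii) uniform quantization (stand-in for the hierarchical scheme)
    q = np.round(coeffs / q_step).astype(np.int32)
    # iv) run-length encode the long runs of zero-valued coefficients
    runs, i = [], 0
    while i < len(q):
        j = i
        while j < len(q) and q[j] == q[i]:
            j += 1
        runs.append((int(q[i]), j - i))
        i = j
    return runs  # an LZW pass over this stream would follow

rng = np.random.default_rng(0)
runs = compress_block(rng.standard_normal(1024))
```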
This article aims to estimate the stock return rate of the private banking sector, for two banks, by adopting a partial linear model based on Arbitrage Pricing Theory (APT), using wavelet and kernel smoothers. The results show that the wavelet method is the best. They also show that the market portfolio and the inflation rate have an adverse effect on the rate of return, while the money supply has a direct impact.
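For illustration, the sketch below implements a Nadaraya-Watson kernel smoother of the kind used for the nonparametric component of a partially linear model; the Gaussian kernel, fixed bandwidth, and synthetic data are assumptions, since the article does not specify these choices.

```python
# Nadaraya-Watson kernel smoother: fitted values on a grid are kernel-weighted
# averages of the observed responses. Gaussian kernel and bandwidth h are
# illustrative choices, not the article's specification.
import numpy as np

def kernel_smooth(x, y, grid, h=0.5):
    # weight matrix: one Gaussian kernel weight per (grid point, sample) pair
    w = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 200))
y = np.sin(x) + rng.normal(scale=0.3, size=200)   # synthetic data
fitted = kernel_smooth(x, y, grid=np.linspace(0, 10, 50))
```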
Regression models are among the most important models used in modern studies, especially research and health studies, because of the important results they achieve. Two regression models were used: the Poisson regression model and the Conway-Maxwell-Poisson model. This study aimed to compare the two models and choose the better one using simulation, at different sample sizes (n = 25, 50, 100) and with r = 1000 replications. Matlab was adopted to conduct the simulation experiment. The results showed the superiority of the Poisson model according to the mean square error criterion (MSE) and also the Akaike information criterion (AIC) for the same distribution.
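A minimal sketch of the Poisson side of such a comparison is shown below, assuming Python's statsmodels (which, unlike the study's Matlab setup, has no built-in Conway-Maxwell-Poisson family, so only the Poisson fit is illustrated); the simulated design and coefficients are illustrative.

```python
# Fit a Poisson regression on simulated count data and read off the two
# comparison criteria used in the study: MSE and AIC.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 100                                       # one of the sample sizes used
X = sm.add_constant(rng.normal(size=(n, 2)))  # intercept + two covariates
y = rng.poisson(np.exp(X @ np.array([0.5, 0.3, -0.2])))

fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
mse = np.mean((y - fit.fittedvalues) ** 2)    # mean square error criterion
print(f"AIC = {fit.aic:.2f}, MSE = {mse:.3f}")
```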
Cloud point extraction (CPE) is a simple, safe, and environmentally friendly technique for preparing many different kinds of samples. In this review, we discuss the CPE method and how to apply it to environmental samples. We also discuss the benefits, problems, and likely developments of CPE. This process has received a great deal of attention for preconcentration and extraction. It has been used as a separation and preconcentration system prior to the analysis of organic compounds (nutrients, polybrominated biphenyl ethers, pesticides, polycyclic aromatic hydrocarbons, polychlorinated compounds, and aromatic amines) and inorganic species, including many metals (silver, lead, cadmium, mercury, and so on).
The ECG is an important tool for the primary diagnosis of heart diseases, showing the electrophysiology of the heart. In our method, a single maternal abdominal ECG signal is taken as the input signal, and the maternal P-QRS-T complexes of the original signal are averaged, repeated, and taken as a reference signal. LMS and RLS adaptive filter algorithms are then applied. The results showed that the fetal ECGs were successfully detected. The accuracy on the DaISy database was up to 84% for LMS and 88% for RLS, while on PhysioNet it was up to 98% and 96% for LMS and RLS, respectively.
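A minimal LMS sketch of the kind of adaptive cancellation described is given below, assuming the averaged maternal complex serves as the reference input; the filter order and step size are illustrative, not the study's settings.

```python
# LMS adaptive filter: the reference (averaged maternal complex) is filtered
# to match the abdominal signal, and the residual error approximates the
# fetal ECG. Order and step size mu are illustrative choices.
import numpy as np

def lms(reference, primary, order=16, mu=0.01):
    w = np.zeros(order)
    error = np.zeros(len(primary))
    for n in range(order, len(primary)):
        x = reference[n - order:n][::-1]   # most recent samples first
        y = w @ x                          # filter output (maternal estimate)
        error[n] = primary[n] - y          # residual ~ fetal component
        w += mu * error[n] * x             # LMS weight update
    return error
```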
With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city.
Numeral recognition is considered an essential preliminary step for optical character recognition, document understanding, and other tasks. Although several handwritten numeral recognition algorithms have been proposed so far, achieving adequate recognition accuracy and execution time remains challenging to date. In particular, recognition accuracy depends on the feature extraction mechanism. As such, a fast and robust numeral recognition method is essential, one that meets the desired accuracy by extracting the features efficiently while maintaining fast implementation time. Furthermore, to date most of the existing studies have evaluated their methods only in clean environments, limiting understanding of their potential.
Liquid-solid chromatography of Bovine Serum Albumin (BSA) on DEAE-cellulose (diethylaminoethyl-cellulose) adsorbent was carried out experimentally to study the effect of changing the influent concentration (0.125, 0.25, 0.5, and 1 mg/ml) at a constant volumetric flow rate of Q = 1 ml/min, and the effect of changing the volumetric flow rate (1, 3, 5, and 10 ml/min) at a constant influent concentration of Co = 0.125 mg/ml. A glass column of 1.5 cm I.D. and 50 cm length was used, packed with DEAE-cellulose adsorbent to a height of 7 cm. The influent was introduced into the column using a peristaltic pump, and the effluent concentration was measured using a UV-spectrophotometer at 30 °C and a 280 nm wavelength. A spread (steeper) breakthrough curve was obtained.
Although Wiener filtering is the optimal tradeoff between inverse filtering and noise smoothing, in the case when the blurring filter is singular, Wiener filtering actually amplifies the noise. This suggests that a denoising step is needed to remove the amplified noise, and wavelet-based denoising provides a natural technique for this purpose.
In this paper a new image restoration scheme is proposed. The scheme contains two separate steps: Fourier-domain inverse filtering and wavelet-domain image denoising. The first stage is Wiener filtering of the input image; the filtered image is then passed to adaptive-threshold wavelet denoising.
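A rough sketch of such a two-stage scheme follows, assuming scipy and pywt; scipy's adaptive wiener() stands in for the Fourier-domain stage, and the db4 wavelet with a universal soft threshold is an illustrative choice rather than the paper's adaptive threshold.

```python
# Two-stage restoration sketch: Wiener filtering, then wavelet-domain soft
# thresholding of the amplified noise. Wavelet, level, and threshold rule
# are illustrative choices.
import numpy as np
import pywt
from scipy.signal import wiener

def restore(image):
    filtered = wiener(image, mysize=5)                   # stage 1: Wiener
    coeffs = pywt.wavedec2(filtered, "db4", level=2)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745   # noise estimate
    thr = sigma * np.sqrt(2 * np.log(image.size))        # universal threshold
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(c, thr, mode="soft") for c in level)
        for level in coeffs[1:]
    ]
    return pywt.waverec2(denoised, "db4")                # stage 2: denoising
```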
Due to the large population of motorway users in Iraq, various approaches have been adopted to manage queues, such as the implementation of traffic lights and the prevention of illegal parking, amongst others. However, defaulters are recorded daily, hence the need to develop a means of identifying these defaulters and bringing them to book. This article discusses the development of an approach to recognizing Iraqi licence plates so that defaulters of queue management systems can be identified. Multiple agencies worldwide have quickly and widely adopted vehicle license plate recognition technology to expand their investigative and security capabilities. License plate recognition helps detect a vehicle's information automatically rather than manually.
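As a rough sketch of classical plate localization and OCR, assuming OpenCV and pytesseract, the snippet below searches edge contours for a quadrilateral with a plate-like aspect ratio and reads the cropped region; the edge thresholds, contour heuristics, and aspect-ratio range are illustrative, not the article's method.

```python
# Classical plate localization: edge detection -> 4-corner contour with a
# plate-like aspect ratio -> OCR on the crop. All thresholds are
# illustrative; a production system would use a trained detector.
import cv2
import pytesseract

def read_plate(path):
    img = cv2.imread(path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(cv2.bilateralFilter(gray, 11, 17, 17), 30, 200)
    contours, _ = cv2.findContours(edges, cv2.RETR_TREE,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in sorted(contours, key=cv2.contourArea, reverse=True)[:10]:
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) == 4:                  # quadrilateral candidate
            x, y, w, h = cv2.boundingRect(approx)
            if 2.0 < w / h < 6.0:             # plate-like aspect ratio
                return pytesseract.image_to_string(gray[y:y + h, x:x + w])
    return None
```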