Compressing speech reduces data storage requirements and shortens the time needed to transmit digitized speech over long-haul links such as the Internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation algorithms introduced here add desirable features to the current transform. We further assess the performance of the MCT in the speech compression application. This paper discusses the effect of using the DWT and the MCT (in one and two dimensions) on speech compression. The performance of the DWT and the MCT is assessed in terms of compression ratio (CR), mean square error (MSE), and peak signal-to-noise ratio (PSNR). Computer simulation results indicate that the two-dimensional MCT offers a better compression ratio, MSE, and PSNR than the DWT.
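As a minimal sketch (not the authors' MCT implementation), the snippet below shows how the three reported metrics can be computed for a simple DWT-based coder: the signal is decomposed with PyWavelets, a hypothetical fraction of the largest coefficients is kept, and CR, MSE, and PSNR are evaluated against the reconstruction.

    # A minimal sketch, assuming PyWavelets and a simple hard-threshold scheme;
    # the wavelet, level, and keep_ratio below are illustrative assumptions.
    import numpy as np
    import pywt

    def compress_and_score(signal, wavelet="db4", level=4, keep_ratio=0.10):
        # Multilevel DWT of the speech frame.
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        flat, slices = pywt.coeffs_to_array(coeffs)

        # Keep only the largest-magnitude coefficients.
        k = max(1, int(keep_ratio * flat.size))
        thresh = np.sort(np.abs(flat))[-k]
        kept = np.where(np.abs(flat) >= thresh, flat, 0.0)

        # Reconstruct and compute the quality metrics discussed in the paper.
        rec = pywt.waverec(pywt.array_to_coeffs(kept, slices, output_format="wavedec"),
                           wavelet)[: signal.size]
        mse = np.mean((signal - rec) ** 2)
        psnr = 10 * np.log10(np.max(np.abs(signal)) ** 2 / mse)
        cr = flat.size / np.count_nonzero(kept)   # compression ratio proxy
        return cr, mse, psnr

    fs = 8000
    t = np.arange(fs) / fs
    speech_like = np.sin(2 * np.pi * 200 * t) * np.exp(-3 * t)  # stand-in signal
    print(compress_and_score(speech_like))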
This study examines postgraduate students' awareness of pragmatic aspects, including Grice's maxims, politeness, and direct and indirect forms of speech. Drawing on Paul Grice's theory of implicature, which is considered one of the most important contributions to pragmatics, this paper discusses how postgraduate students can meet the cooperative principle when communicating effectively. It also outlines how politeness principles influence obeying or violating the maxims and how the use of direct or indirect forms of utterance is prompted by politeness. Sixteen master's students of Linguistics and Literature were asked to take a multiple-choice test. The test is presented along with the interpretation of each option.
This paper introduces a relation between the resultant and the Jacobian determinant by generalizing Sakkalis' theorem from two polynomials in two variables to the case of n polynomials in n variables. This leads us to study results of this type and to use this relation to attack the Jacobian problem. The last section presents our contribution towards proving the conjecture.
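For illustration only (this is my own sketch, not the paper's proof), the snippet below checks the Jacobian condition for a hypothetical two-variable polynomial map with SymPy and also computes a resultant, the two objects such relations connect.

    # Illustrative check of the Jacobian condition for a polynomial map F = (f, g);
    # the example map is an assumption chosen to have constant Jacobian.
    import sympy as sp

    x, y, u, v = sp.symbols("x y u v")
    f = x + y**3
    g = y

    J = sp.Matrix([[sp.diff(f, x), sp.diff(f, y)],
                   [sp.diff(g, x), sp.diff(g, y)]])
    print("Jacobian determinant:", sp.simplify(J.det()))   # 1, a nonzero constant

    # A resultant of the shifted polynomials with respect to y, the kind of
    # quantity that resultant-Jacobian relations involve.
    print("Res_y(f - u, g - v):", sp.resultant(f - u, g - v, y))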
In recent years, new non-invasive laser methods have been used to detect breast tumors in pre- and postmenopausal females. Methods based on laser radiation are safer than the other routinely used methods of breast tumor detection, such as X-ray mammography, CT scanning, and nuclear medicine.
One of these new methods is called FDPM (Frequency Domain Photon Migration). It is based on modulating a laser beam with variable-frequency sinusoidal waves. The modulated laser radiation illuminates the breast tissue and is received on the opposite side.
In this paper, the amplitude and the phase shift of the received signal were calculated.
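As a minimal sketch under my own assumptions (the paper's FDPM model is not reproduced here), the snippet below extracts the amplitude and phase shift of a sinusoidally modulated signal by lock-in style I/Q demodulation, which is one standard way such quantities are measured.

    # Extract amplitude and phase of a modulated signal; rates and the simulated
    # signal below are hypothetical.
    import numpy as np

    def amplitude_and_phase(received, f_mod, fs):
        t = np.arange(received.size) / fs
        ref_i = np.cos(2 * np.pi * f_mod * t)
        ref_q = np.sin(2 * np.pi * f_mod * t)
        i = 2 * np.mean(received * ref_i)            # in-phase component
        q = 2 * np.mean(received * ref_q)            # quadrature component
        return np.hypot(i, q), np.arctan2(-q, i)     # amplitude, phase (radians)

    fs, f_mod = 1.0e6, 100e3                         # assumed sample/modulation rates
    t = np.arange(4000) / fs                         # whole number of cycles
    simulated = 0.3 * np.cos(2 * np.pi * f_mod * t - 0.7)   # attenuated, delayed
    amp, phase = amplitude_and_phase(simulated, f_mod, fs)
    print(amp, phase)                                # ~0.3 and ~-0.7 rad (0.7 rad lag)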
Banks are among the public services that must be available in a city to ensure easy financial dealings between citizens and state departments, between the state departments themselves, and among the citizens, and to ensure easy access to them. It is therefore very important to choose the best location for a bank, one that can serve the largest possible number of the population while providing easy access. Because of the difficulty of obtaining accurate information on the exact coordinates in the country's specific projection, the researcher resorts to working with default data using some of the files available in the ArcView program.
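Purely as an illustration of the underlying site-selection idea (not the study's ArcView workflow), the sketch below scores hypothetical candidate sites by the population they cover within an assumed service radius and picks the best one.

    # Coordinates, weights, and the service radius are all illustrative assumptions.
    import numpy as np

    population = np.array([[2.0, 1.0, 500],    # x, y, residents served
                           [3.5, 2.0, 1200],
                           [1.0, 3.0, 800],
                           [4.0, 4.5, 300]])
    candidates = np.array([[2.5, 1.5], [3.0, 3.0]])
    radius = 1.5                               # assumed service radius (km)

    def covered(site):
        d = np.hypot(population[:, 0] - site[0], population[:, 1] - site[1])
        return population[d <= radius, 2].sum()

    best = max(candidates, key=covered)
    print("best candidate site:", best, "covers", covered(best), "people")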
The penalized least squares method is a popular method for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. The properties of the penalized least squares method are high prediction accuracy and simultaneous estimation and variable selection. The penalized least squares method gives a sparse model, that is, a model with few variables, so that it can be interpreted easily. Penalized least squares is not robust, meaning it is very sensitive to the presence of outlying observations. To deal with this problem, we can use a robust loss function to obtain the robust penalized least squares method and a robust penalized estimator.
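A minimal sketch, not the paper's estimator: it contrasts an ordinary Lasso (penalized least squares) with a Huber-loss regressor on data containing a few gross outliers. Note that scikit-learn's HuberRegressor uses an L2 penalty (its alpha), so this only illustrates the idea of swapping in a robust loss.

    import numpy as np
    from sklearn.linear_model import Lasso, HuberRegressor

    rng = np.random.default_rng(0)
    n, p = 50, 100                       # n < p: high-dimensional setting
    X = rng.normal(size=(n, p))
    beta = np.zeros(p)
    beta[:5] = 3.0                       # sparse true coefficients
    y = X @ beta + rng.normal(scale=0.5, size=n)
    y[:3] += 25.0                        # inject outlying observations

    lasso = Lasso(alpha=0.1).fit(X, y)
    huber = HuberRegressor(epsilon=1.35, alpha=0.1, max_iter=1000).fit(X, y)
    print("nonzero Lasso coefficients:", np.count_nonzero(lasso.coef_))
    print("Huber coefficients on the true support:", huber.coef_[:5].round(2))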
The Elzaki Transform Adomian Decomposition Method (ETADM), an elegant combination of the two techniques, has been employed in this work to solve non-linear Riccati matrix differential equations. Solutions are presented to demonstrate the relevance of the current approach. The results of the proposed strategy are displayed and evaluated with the use of figures. It is demonstrated that the suggested approach is effective, dependable, and simple to apply to a range of related scientific and technical problems.
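As a short sketch of only the decomposition half of such a scheme (the Elzaki transform stage is not shown), the SymPy snippet below generates the Adomian polynomials for the quadratic nonlinearity N(y) = y^2 that appears in Riccati equations, using the standard formula A_n = (1/n!) d^n/dλ^n [N(Σ λ^k y_k)] at λ = 0.

    import sympy as sp

    t, lam = sp.symbols("t lambda")
    n_terms = 4
    y = [sp.Function(f"y{k}")(t) for k in range(n_terms)]

    series = sum(lam**k * y[k] for k in range(n_terms))
    N = series**2                                   # quadratic Riccati nonlinearity
    A = [sp.simplify(sp.diff(N, lam, n).subs(lam, 0) / sp.factorial(n))
         for n in range(n_terms)]
    for n, An in enumerate(A):
        print(f"A{n} =", An)
    # A0 = y0**2, A1 = 2*y0*y1, A2 = y1**2 + 2*y0*y2, ...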
The Field Programmable Gate Array (FPGA) approach is the most recent category to take its place in the implementation of most Digital Signal Processing (DSP) applications. It has proved capable of handling such problems and supports all the necessary requirements, such as scalability, speed, size, cost, and efficiency.
In this paper, a new circuit design is proposed and implemented on an FPGA for evaluating the coefficients of the two-dimensional Wavelet Transform (WT) and Wavelet Packet Transform (WPT).
In this implementation, the evaluation of the WT and WPT coefficients depends on filter-tree decomposition using the 2-D discrete convolution algorithm.
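The snippet below is a software reference sketch of the same filter-tree idea, not the paper's FPGA circuit: one level of the 2-D WT computed by separable convolution and downsampling, with Haar filters assumed for simplicity.

    import numpy as np
    from scipy.signal import convolve2d

    lo = np.array([1.0, 1.0]) / np.sqrt(2)     # Haar low-pass (assumed filters)
    hi = np.array([1.0, -1.0]) / np.sqrt(2)    # Haar high-pass

    def dwt2_level(image):
        def filt(img, row_f, col_f):
            # Filter along one axis, downsample by 2, then the other axis.
            tmp = convolve2d(img, row_f[np.newaxis, :], mode="full")[:, 1::2]
            return convolve2d(tmp, col_f[:, np.newaxis], mode="full")[1::2, :]
        LL = filt(image, lo, lo)               # approximation subband
        LH = filt(image, lo, hi)               # detail subbands
        HL = filt(image, hi, lo)
        HH = filt(image, hi, hi)
        return LL, LH, HL, HH

    img = np.arange(64, dtype=float).reshape(8, 8)
    for name, band in zip(("LL", "LH", "HL", "HH"), dwt2_level(img)):
        print(name, band.shape)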
Automatic recognition of individuals is very important in the modern era. Biometric techniques have emerged as an answer to the problem of automatic individual recognition. This paper presents a technique for pupil detection that combines simple morphological operations with the Hough Transform (HT). The circular areas of the eye and pupil are segmented by the morphological filter and the Hough Transform (HT), and the local iris area is converted into a rectangular block for the purpose of calculating inconsistencies in the image. This method is implemented and tested on the Chinese Academy of Sciences (CASIA V4) iris image database (249 persons) and the IIT Delhi (IITD) iris database.
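A compact OpenCV sketch of the same ingredients the paper combines, morphological filtering followed by the circular Hough Transform; the file name and all parameter values are assumptions for illustration, not those of the paper.

    import cv2
    import numpy as np

    eye = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input image
    if eye is None:
        raise SystemExit("provide a grayscale eye image as eye.png")

    # Opening suppresses specular highlights and small bright noise in the pupil.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    cleaned = cv2.morphologyEx(eye, cv2.MORPH_OPEN, kernel)
    blurred = cv2.medianBlur(cleaned, 5)

    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
                               param1=100, param2=30, minRadius=20, maxRadius=80)
    if circles is not None:
        x, y, r = np.round(circles[0, 0]).astype(int)
        print("pupil centre:", (x, y), "radius:", r)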
A great number of image processing systems are being used and developed on a daily basis. Those systems need to deploy some basic operations, such as detecting regions of interest and matching those regions, in addition to describing their properties. These operations play a significant role in the decision making required for the subsequent operations, depending on the assigned task. In order to accomplish these tasks, various algorithms have been introduced over the years. One of the most popular is the Scale Invariant Feature Transform (SIFT), whose efficiency lies in its performance in the detection and property description process.
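A minimal sketch of SIFT keypoint detection, description, and matching with OpenCV; the image names and the ratio-test threshold are assumptions for illustration.

    import cv2

    img1 = cv2.imread("scene_a.png", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("scene_b.png", cv2.IMREAD_GRAYSCALE)

    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Match descriptors and keep pairs passing Lowe's ratio test.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
            if m.distance < 0.75 * n.distance]
    print(len(kp1), "and", len(kp2), "keypoints;", len(good), "good matches")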