Speech is the essential means of interaction between humans and between humans and machines. However, it is almost always contaminated with different types of environmental noise. Therefore, speech enhancement algorithms (SEAs) have emerged as a significant approach in the speech processing field to suppress background noise and recover the original speech signal. In this paper, a new, efficient two-stage SEA with low distortion is proposed in the minimum mean square error (MMSE) sense. The clean signal is estimated by exploiting Laplacian speech and noise modeling of the coefficient distribution of an orthogonal transform, the Discrete Krawtchouk-Tchebichef transform (DKTT). The DKTT has high energy compaction and provides a close match between the Laplacian density and its coefficient distribution, which helps reduce residual noise without sacrificing speech components. Moreover, a cascaded hybrid speech estimator is proposed, using two filter stages (non-linear and linear) in the DKTT domain to lessen the residual noise effectively without distorting the speech signal. The linear estimator acts as a post-processing filter that reinforces noise suppression by regenerating speech components. The output results have been compared with existing work in terms of different quality and intelligibility measures. The comparative evaluation confirms the superior performance of the proposed SEA in various noisy environments. The improvement ratios of the presented algorithm in terms of the PESQ measure are 5.8% and 1.8% for white and babble noise environments, respectively. In addition, the improvement ratios in terms of the OVL measure are 15.7% and 9.8% for white and babble noise environments, respectively.
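As a rough illustration of the two-stage, transform-domain idea only (not the paper's exact DKTT/Laplacian-MMSE estimators), the sketch below applies a non-linear gain followed by a linear Wiener-type post-filter to the coefficients of a stand-in orthogonal transform (the DCT); the gain formulas, the noise-variance input, and the smoothing factor alpha are illustrative assumptions.

```python
# Minimal sketch of a two-stage, transform-domain speech enhancer (illustrative only).
import numpy as np
from scipy.fftpack import dct, idct

def enhance_frame(noisy_frame, noise_var, alpha=0.98):
    """Two-stage gain applied in an orthogonal transform domain."""
    Y = dct(noisy_frame, norm='ortho')                    # forward transform (DCT stands in for the DKTT)
    snr_post = np.maximum(Y**2 / noise_var - 1.0, 1e-3)   # rough a-posteriori SNR per coefficient
    # Stage 1: non-linear (spectral-subtraction-like) gain suppresses most of the noise
    g1 = np.sqrt(snr_post / (snr_post + 1.0))
    S1 = g1 * Y
    # Stage 2: linear Wiener-type post-filter on the first-stage estimate,
    # acting as the post-processing stage that reinforces weak speech components
    snr_prio = alpha * (S1**2 / noise_var) + (1.0 - alpha) * snr_post
    g2 = snr_prio / (snr_prio + 1.0)
    return idct(g2 * S1, norm='ortho')                    # back to the time domain
```

In practice such a function would be applied frame by frame with windowing and overlap-add; those details are omitted here.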
The rapid development of telemedicine services and the requirements for exchanging medical information between physicians, consultants, and health institutions have made the protection of patients' information an important priority for any future e-health system. The protection of medical information, including the cover (i.e. the medical image), has requirements that differ slightly from those for protecting other information. The cover must be preserved with high fidelity because of its importance on the receiving side, where medical staff use this information to provide a diagnosis and save a patient's life. If the cover is tampered with, the goal of telemedicine cannot be achieved. Therefore, this work provides an in
Recommender systems are tools for making sense of the huge amount of data available on the internet. Collaborative filtering (CF) is one of the most widely and successfully used knowledge discovery methods in recommendation systems. Memory-based collaborative filtering relies on information about existing users to predict new items for the target user. Similarity measures are the core operations in collaborative filtering, and prediction accuracy depends largely on the similarity calculations. In this study, a combination of weighted parameters and traditional similarity measures is applied to calculate the relationships among users over the MovieLens dataset rating matrix. The advantages and disadvantages of each measure are identified. From the study, a n
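By way of illustration only (the exact weighted parameters used in the study are not specified here), the sketch below combines a traditional Pearson similarity with a common co-rated-count weighting over a user-item rating matrix such as MovieLens; the function name and the min_common threshold are hypothetical choices.

```python
# Illustrative memory-based CF similarity: Pearson correlation damped by a
# significance (co-rated-count) weight.  NaN entries denote unrated items.
import numpy as np

def weighted_pearson(ratings_u, ratings_v, min_common=50):
    """Weighted Pearson similarity between two users' rating vectors."""
    common = ~np.isnan(ratings_u) & ~np.isnan(ratings_v)
    n = int(common.sum())
    if n < 2:
        return 0.0
    ru, rv = ratings_u[common], ratings_v[common]
    du, dv = ru - ru.mean(), rv - rv.mean()
    denom = np.sqrt((du ** 2).sum() * (dv ** 2).sum())
    sim = float((du * dv).sum() / denom) if denom > 0 else 0.0
    return sim * min(n, min_common) / min_common   # damp similarities built on few co-rated items
```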
... Show MoreHowever, the effects of these ideas are still evident
Think of those who follow the footsteps of Muslim scholars and thinkers.
Intellectual source of Gnostic Gnar
And in dealing with issues of Islamic thought
Annalisa intersecting trends of thought, Minya who melts in Masarya and supported him and believes in Bo and extremist Vue,
Lea, trying to throw it with a few times, looks at Elya look
There are those who stand in the opposite position
There are those who stand in a selective compromise, but this school of thought remains
G - the features of Islamic thought that believes in the mind and Imoto in the defense of
An intellectual station and generalized bar
Creed. In his study, the researcher wi
Considering the science of kalam (the "science of speech") in light of its purposes is a precise scholarly study that examines the reality in which it originated, the subject it dealt with, and the goals it sought. It follows its main course in affirming the creed, arguing for it, and repelling objections raised against it. This study shows the realism of the science of kalam in its emergence, subject, and method, since it sprang from the reality of the Islamic nation and was based on its intellectual needs. Its presence was therefore necessary in the life of the Islamic nation because of its role in facing the challenges confronting the Islamic faith and the dangers to which it was exposed as a result of the intellectu
Political speeches take different forms, such as political forums, events, or inaugural addresses. This research critically analyzes the inaugural speech of President Donald Trump, delivered on 20 January 2017, as retrieved on 10 May 2017 from www.cnn.com. The objectives of the study are, first, to classify and discuss well-known microstructures (linguistic features) of the speech, and second, to classify the macrostructures, i.e. the social structures embedded in the delivered political inaugural speech. To achieve these objectives, the researcher adopts Norman Fairclough's three-dimensional analytical model (1989). Following the model, the speech was subm
The application of test case prioritization is a key part of system testing, intended to detect and resolve issues early in the development stage. Traditional prioritization techniques frequently fail to account for the complexities of large-scale test suites, evolving systems, and time constraints, and therefore cannot fully address this problem. The present study proposes a hybrid meta-heuristic method that focuses on these challenges. The strategy combines genetic algorithms with the black hole algorithm to strike a smooth trade-off between exploring numerous candidate orderings and exploiting the best one. The proposed hybrid genetic black hole algorithm (HGBH) uses the
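As a hedged sketch only (the fitness function, the operators, and the "pull toward the best ordering" step below are generic assumptions, not the paper's exact HGBH formulation), a genetic/black-hole hybrid for test case prioritization might look like this, using APFD as the fitness and assuming every fault is detected by at least one test:

```python
# Hypothetical sketch of a genetic / black-hole hybrid for test case prioritization.
import random

def apfd(order, faults_per_test, n_faults):
    """Average Percentage of Faults Detected for a test ordering (higher is better)."""
    first_pos = {}
    for pos, t in enumerate(order, start=1):
        for f in faults_per_test[t]:
            first_pos.setdefault(f, pos)        # position at which each fault is first detected
    n = len(order)
    return 1 - sum(first_pos.values()) / (n * n_faults) + 1 / (2 * n)

def pull_toward(order, best):
    """Black-hole-style move: reorder a random slice according to the best ordering."""
    child = order[:]
    i, j = sorted(random.sample(range(len(order)), 2))
    child[i:j] = [t for t in best if t in order[i:j]]
    return child

def hgbh(tests, faults_per_test, n_faults, pop=20, gens=50):
    population = [random.sample(tests, len(tests)) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda o: apfd(o, faults_per_test, n_faults), reverse=True)
        best = population[0]                    # the "black hole" attracts the rest
        next_gen = [best]                       # keep the best ordering (elitism)
        for order in population[1:]:
            child = pull_toward(order, best)
            if random.random() < 0.2:           # light swap mutation keeps exploration alive
                a, b = random.sample(range(len(child)), 2)
                child[a], child[b] = child[b], child[a]
            next_gen.append(child)
        population = next_gen
    return max(population, key=lambda o: apfd(o, faults_per_test, n_faults))
```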
Numeral recognition is considered an essential preliminary step for optical character recognition, document understanding, and other tasks. Although several handwritten numeral recognition algorithms have been proposed so far, achieving adequate recognition accuracy and execution time remains challenging. In particular, recognition accuracy depends on the feature extraction mechanism. As such, a fast and robust numeral recognition method is essential, one that meets the desired accuracy by extracting features efficiently while maintaining a fast execution time. Furthermore, to date most existing studies have focused on evaluating their methods in clean environments, thus limiting understanding of their potential a
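Purely as an illustration of the evaluation concern raised above (not the method proposed in the paper), the sketch below trains a simple digit classifier and reports accuracy on both clean and artificially noised test images; the dataset, the use of raw pixels in place of a dedicated feature extractor, the k-NN classifier, and the Gaussian noise level are all assumptions used to mimic a non-clean environment.

```python
# Illustrative clean-vs-noisy evaluation of a simple numeral recognizer.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

digits = load_digits()                                   # 8x8 handwritten digit images
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.3, random_state=0)

clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

clean_acc = accuracy_score(y_test, clf.predict(X_test))
noisy_test = X_test + np.random.default_rng(0).normal(0.0, 2.0, X_test.shape)
noisy_acc = accuracy_score(y_test, clf.predict(noisy_test))
print(f"clean accuracy: {clean_acc:.3f}, noisy accuracy: {noisy_acc:.3f}")
```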