Various speech enhancement algorithms (SEAs) have been developed over the last few decades. Each algorithm has its advantages and disadvantages because the speech signal is affected by environmental conditions. Distortion of speech causes the loss of important features, making the signal difficult to understand. An SEA aims to improve the intelligibility and quality of speech that has been degraded by different types of noise. In most applications, quality improvement is highly desirable, as it can reduce listener fatigue, especially when the listener is exposed to high noise levels for extended periods (e.g., in manufacturing). Because SEAs reduce or suppress the background noise to some degree, they are sometimes called noise suppression algorithms. In this research, an SEA based on different speech models (the Laplacian model or the Gaussian model) has been implemented using two types of discrete transforms: the Discrete Tchebichef Transform and the Discrete Tchebichef-Krawtchouk Transform. The proposed estimator consists of two stages of a Wiener filter that can effectively estimate the clean speech signal. The results of the evaluation measures show the ability of the proposed SEA to enhance the noisy speech signal, based on a comparison with other types of speech models and a self-comparison across different types and levels of noise. The improvement ratios of the presented algorithm in terms of average segmental SNR (SNRseg) are 1.96, 2.12, and 2.03 for Buccaneer, White, and Pink noise, respectively.
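The dual-stage idea in the abstract can be sketched as two cascaded applications of the classic Wiener gain G = SNR/(1 + SNR). This is a minimal spectral-domain illustration, not the paper's estimator: the Tchebichef/Tchebichef-Krawtchouk transforms and the speech-model priors are omitted, and the noise power spectrum is assumed known.

```python
import numpy as np

def wiener_gain(noisy_power, noise_power, eps=1e-12):
    """Per-bin Wiener gain G = SNR / (1 + SNR), with SNR estimated
    by power subtraction (clipped at zero)."""
    snr = np.maximum(noisy_power - noise_power, 0.0) / (noise_power + eps)
    return snr / (1.0 + snr)

def dual_stage_wiener(noisy_spectrum, noise_power):
    """Apply the Wiener gain twice in cascade: the output of stage 1
    becomes the 'noisy' input of stage 2 (illustrative dual stage)."""
    g1 = wiener_gain(np.abs(noisy_spectrum) ** 2, noise_power)
    stage1 = g1 * noisy_spectrum
    g2 = wiener_gain(np.abs(stage1) ** 2, noise_power)
    return g2 * stage1
```

Since each gain lies in [0, 1], the second stage can only attenuate further, which is why cascading sharpens noise suppression at the cost of some extra speech distortion.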
In adults, typical atrioventricular nodal reentrant tachycardia (AVNRT) is considered the most common paroxysmal supraventricular tachycardia. The dual-pathway concept is still widely and commonly accepted and used. According to the guidelines, ablation of the slow pathway remains the first-line treatment, with a good success rate. This study identifies the electrophysiological differences of the atrioventricular nodal pathways before and after ablation. An electrophysiological study was performed on 54 patients with typical AVNRT only; they were 40 (74%) females and 14 (26%) males, divided into two groups: G1, 38 patients (70.4%) with one pathway, and G2, 16 patients (29.6%) with multiple pathways. After induction, we studied the clinical and electrophysiological features of the tachycardia.
In this research, one of the most important and widely used models in many applications is the linear mixed model, which is commonly used to analyze longitudinal data characterized by repeated measures. The linear mixed model is estimated using two methods (parametric and nonparametric), which are used to estimate the conditional mean and the marginal mean of the model. A comparison between a number of models is made to obtain the best model to represent the mean wind speed in Iraq. The application concerns 8 meteorological stations in Iraq that were selected randomly, from which monthly wind speed data were taken over ten years and then averaged over each month of the corresponding year.
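The distinction between the marginal mean and the conditional mean can be sketched for the simplest random-intercept mixed model, y_ij = mu + b_i + e_ij. The balanced layout and known variance components (tau2 for stations, sigma2 for residuals) are simplifying assumptions for illustration, not the paper's estimation method.

```python
import numpy as np

def blup_means(y, tau2, sigma2):
    """y: array of shape (groups, repeats), e.g. (stations, months).
    Returns the marginal (population) mean and the conditional
    (group-level BLUP) means under a random-intercept model."""
    mu = y.mean()                          # marginal mean: fixed effect only
    n = y.shape[1]                         # repeats per group (balanced case)
    shrink = tau2 / (tau2 + sigma2 / n)    # BLUP shrinkage factor in [0, 1]
    cond = mu + shrink * (y.mean(axis=1) - mu)  # conditional means per group
    return mu, cond
```

With a large between-group variance the conditional means approach the raw station means; with tau2 = 0 they all collapse to the marginal mean.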
Metasurface polarizers are essential optical components in modern integrated optics and play a vital role in many optical applications, including Quantum Key Distribution systems in quantum cryptography. However, the inverse design of high-efficiency metasurface polarizers depends on properly predicting the structural dimensions from the required optical response. Deep learning neural networks can efficiently assist the inverse design process, minimizing both time and simulation resource requirements, while achieving better results than traditional optimization methods. Here, a COMSOL Multiphysics surrogate model and deep neural networks are used to design a metasurface grating structure with a high extinction ratio.
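The surrogate-assisted inverse-design loop can be sketched as a search over a cheap forward model. Here a toy quadratic function stands in for the trained COMSOL/neural-network surrogate, and the parameters (grating period, fill factor) and their ranges are hypothetical, chosen only for illustration.

```python
import numpy as np

def surrogate_er(period_nm, fill):
    """Toy stand-in for a trained forward surrogate: maps grating
    (period in nm, fill factor) to a predicted extinction ratio in dB.
    The quadratic shape and the optimum at (600 nm, 0.5) are invented."""
    return 30.0 - 0.001 * (period_nm - 600.0) ** 2 - 40.0 * (fill - 0.5) ** 2

def best_design():
    """Inverse design by exhaustive evaluation of the surrogate on a
    parameter grid; a real workflow would use gradients or a tandem net."""
    periods = np.linspace(400.0, 800.0, 81)
    fills = np.linspace(0.2, 0.8, 61)
    P, F = np.meshgrid(periods, fills)
    er = surrogate_er(P, F)
    i = np.argmax(er)                      # grid point with best predicted ER
    return P.flat[i], F.flat[i], er.flat[i]
```

Because the surrogate is cheap to evaluate, the whole grid costs far less than a single full-wave simulation, which is the point of the surrogate-based approach.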
Cryptography is a method used to mask text with an encryption method so that only the authorized user can decrypt and read the message. Intruders attack in many ways to gain access to the communication channel, such as impersonation, repudiation, denial of service, modification of data, threatening confidentiality, and breaking the availability of services. The high volume of electronic communication between people requires ensuring that transactions remain confidential, and cryptography methods give the best solution to this problem. This paper proposes a new cryptography method based on Arabic words; the method is carried out in two steps, of which the first is binary encoding generation.
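The first step named in the abstract, binary encoding generation, might look like the following generic sketch, which simply round-trips text (including Arabic script) through UTF-8 bits. The paper's actual encoding scheme is not specified in the excerpt, so this is only an assumed baseline.

```python
def to_binary(text):
    """Encode text (Arabic or otherwise) as a string of bits via UTF-8."""
    return "".join(f"{b:08b}" for b in text.encode("utf-8"))

def from_binary(bits):
    """Inverse: regroup the bit string into bytes and decode UTF-8."""
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")
```

A second step (the actual enciphering, e.g. substitution keyed on Arabic words) would then operate on this bit string.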
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Twitter content is short, unstructured, and messy text, which makes it difficult to find topics in tweets. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short text content like tweets. Fortunately, Twitter has many features that represent the interaction between users, and tweets carry rich user-generated hashtags as keywords. In this paper, we exploit the hashtag feature to improve the topics learned.
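One common way to exploit hashtags for short-text topic modeling is hashtag pooling: tweets sharing a hashtag are concatenated into longer pseudo-documents before running LDA, so the model sees document-length inputs. The pooling scheme below is an illustrative assumption, not necessarily the paper's exact method.

```python
import re
from collections import defaultdict

def pool_by_hashtag(tweets):
    """Aggregate tweets into hashtag-level pseudo-documents.
    Tweets without hashtags fall into a '_none' pool; a tweet with
    several hashtags contributes to each of their pools."""
    pools = defaultdict(list)
    for t in tweets:
        tags = re.findall(r"#(\w+)", t.lower())
        for tag in tags or ["_none"]:
            pools[tag].append(t)
    return {tag: " ".join(ts) for tag, ts in pools.items()}
```

The resulting pseudo-documents can be fed to any standard LDA implementation in place of the raw tweets.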
Authentication is the process of determining whether someone or something is, in fact, who or what it is declared to be. As dependence upon computers and computer networks grows, the need for user authentication has increased. A user's claimed identity can be verified by one of several methods. One of the most popular of these methods is based on something the user knows, such as a password or Personal Identification Number (PIN). Biometrics is the science and technology of authentication by identifying a living individual's physiological or behavioral attributes. Keystroke authentication is a behavioral access control method that identifies legitimate users via their typing behavior. The objective of this paper is to provide user authentication based on keystroke dynamics.
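Keystroke-dynamics systems typically build a timing profile from inter-key latencies and compare it against an enrolled template. A minimal sketch, using a mean-absolute-deviation score as an assumed distance measure (real systems use richer features such as dwell times and statistical classifiers):

```python
import statistics

def flight_times(timestamps):
    """Inter-key latencies (flight times) from key-press timestamps in ms."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def match_score(enrolled, attempt):
    """Mean absolute deviation between two latency profiles of equal
    length; lower means the attempt is closer to the enrolled template."""
    return statistics.fmean(abs(e - a) for e, a in zip(enrolled, attempt))
```

A login attempt would be accepted when its score against the stored template falls below a tuned threshold.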
Iris research is focused on developing techniques for identifying and locating relevant biometric features, with accurate segmentation and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method consists of a number of steps, starting from converting the grayscale eye image to a bit-plane representation and selecting the most significant bit planes, followed by a parameterization of the iris location, resulting in an accurate segmentation of the iris from the original image.
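The bit-plane step described above can be sketched directly: an 8-bit grayscale image is split into eight binary planes, of which the most significant carry the coarse structure used for segmentation. The iris parameterization itself is not reproduced here.

```python
import numpy as np

def bit_planes(gray):
    """Split an 8-bit grayscale image into its 8 binary bit planes,
    most significant plane first. gray: uint8 array of any shape."""
    return [(gray >> b) & 1 for b in range(7, -1, -1)]
```

Selecting, say, the top two planes keeps the high-contrast pupil/iris boundary while discarding fine texture, which is what makes the subsequent parameterization cheap.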
The quality of Global Navigation Satellite System (GNSS) networks is considerably influenced by the configuration of the observed baselines. This study aims to find an optimal configuration for GNSS baselines, in terms of the number and distribution of baselines, that improves the quality criteria of GNSS networks. The first order design (FOD) problem was applied in this research to optimize the GNSS network baseline configuration, based on the sequential adjustment method to solve its objective functions.
FOD for optimum precision (FOD-p) was the proposed model, which is based on the design criteria of A-optimality and E-optimality. These design criteria were selected as the objective functions for precision.
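The two criteria can be sketched directly from a parameter covariance matrix: A-optimality minimizes its trace (average variance) and E-optimality minimizes its largest eigenvalue (worst-case variance). The sequential adjustment that produces the covariance for each candidate baseline configuration is omitted here.

```python
import numpy as np

def a_optimality(cov):
    """A-optimality score: trace of the parameter covariance matrix
    (sum of variances); smaller is better."""
    return np.trace(cov)

def e_optimality(cov):
    """E-optimality score: largest eigenvalue of the covariance matrix
    (variance in the weakest direction); smaller is better."""
    return np.linalg.eigvalsh(cov).max()
```

A FOD search would evaluate both scores for each candidate set of baselines and keep the configuration that minimizes the chosen criterion.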