Various speech enhancement algorithms (SEAs) have been developed over the last few decades. Each algorithm has advantages and disadvantages because the speech signal is affected by environmental conditions. Distortion of speech causes the loss of important features, making the signal challenging to understand. An SEA aims to improve the intelligibility and quality of speech that has been degraded by different types of noise. In most applications, quality improvement is highly desirable because it can reduce listener fatigue, especially when the listener is exposed to high noise levels for extended periods (e.g., in manufacturing). Because an SEA reduces or suppresses background noise to some degree, such algorithms are sometimes called noise suppression algorithms. In this research, SEAs based on different speech models (the Laplacian model and the Gaussian model) were implemented using two types of discrete transforms: the Discrete Tchebichef Transform and the Discrete Tchebichef-Krawtchouk Transform. The proposed estimator consists of dual stages of a Wiener filter that can effectively estimate the clean speech signal. The evaluation results show the proposed SEA's ability to enhance noisy speech, based on a comparison with other speech models and on a self-comparison across different noise types and levels. The presented algorithm's improvement ratios in average segmental SNR (SNRseg) are 1.96, 2.12, and 2.03 for Buccaneer, White, and Pink noise, respectively.
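The dual-stage estimator described above can be sketched in a few lines. This is a minimal illustration, not the paper's exact estimator: the DCT stands in for the Discrete Tchebichef Transform, the noise variance is assumed known, and the spectral floor value is an assumption.

```python
import numpy as np
from scipy.fft import dct, idct

def dual_stage_wiener(noisy, noise_var, floor=1e-3):
    """Two-pass Wiener filtering in a discrete transform domain.

    Sketch only: the DCT is a stand-in for the Discrete Tchebichef
    Transform, and `noise_var` is assumed known in advance.
    """
    coeffs = dct(noisy, norm="ortho")
    for _ in range(2):  # dual stages of the Wiener filter
        # crude estimate of clean-signal power per coefficient
        clean_power = np.maximum(coeffs**2 - noise_var, floor)
        gain = clean_power / (clean_power + noise_var)  # Wiener gain
        coeffs = gain * coeffs
    return idct(coeffs, norm="ortho")
```

Because speech energy concentrates in a few transform coefficients while noise spreads across all of them, the gain suppresses noise-dominated coefficients while largely preserving the signal.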
As an important resource, entangled light sources have been used in developing quantum information technologies such as quantum key distribution (QKD). Few experiments have implemented entanglement-based deterministic QKD protocols, since the security of existing protocols may be compromised in lossy channels. In this work, we report a loss-tolerant deterministic QKD experiment that follows a modified “Ping-Pong” (PP) protocol. The experimental results demonstrate for the first time that a secure deterministic QKD session can be fulfilled in a channel with an optical loss of 9 dB, based on a telecom-band entangled photon source. This exhibits a conceivable prospect of utilizing entangled light sources in real-life fiber-based
In this article, a Convolutional Neural Network (CNN) is used to detect damage and no-damage images from satellite imagery using different classifiers. These classifiers are well-known models that are used with the CNN to detect and classify images using a specific dataset. The dataset used belongs to the Houston hurricane that caused several damages in the nearby areas. In addition, a transfer learning property is used to store the knowledge (weights) and reuse it in the next task. Moreover, each applied classifier is used to detect the images from the dataset after it is split into training, testing, and validation sets. The Keras library is used to apply the CNN algorithm with each selected classifier to detect the images. Furthermore, the performa
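The transfer-learning setup described above can be sketched in Keras. This is a minimal illustration under assumptions: MobileNetV2 stands in for whichever pretrained classifier is paired with the CNN, the input size is arbitrary, and `weights=None` keeps the sketch offline (passing `weights="imagenet"` is what actually reuses stored knowledge).

```python
import tensorflow as tf

def build_damage_classifier(input_shape=(128, 128, 3)):
    """Transfer-learning CNN for damage / no-damage satellite tiles.

    Sketch only: MobileNetV2 and the input size are assumptions, not
    the article's exact classifier choices.
    """
    base = tf.keras.applications.MobileNetV2(
        include_top=False, weights=None, input_shape=input_shape)
    base.trainable = False  # transfer learning: freeze stored weights
    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # damage probability
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```

Training would then call `model.fit` on the training split and evaluate on the held-out test split.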
The importance of forecasting has emerged in the economic field as a means of achieving economic growth. Forecasting is one of the important topics in time series analysis, and accurate forecasting of time series is one of the most important challenges in seeking the best decision. This research suggests the use of hybrid models for forecasting daily crude oil prices. The hybrid model integrates a linear component, represented by Box-Jenkins models, with a non-linear component, represented by artificial intelligence methods: the long short-term memory (LSTM) and gated recurrent unit (GRU) deep learning models. It was found that the proposed h
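The hybrid scheme above (linear forecast plus a nonlinear model of the residuals) can be sketched with stand-ins. Both components here are assumptions for brevity: a least-squares AR(p) fit replaces the Box-Jenkins model, and a k-nearest-neighbour residual predictor replaces the LSTM/GRU.

```python
import numpy as np

def hybrid_forecast(series, p=3, k=5):
    """One-step hybrid forecast: linear component + residual component.

    Sketch only: AR(p) least squares stands in for Box-Jenkins, and a
    k-NN residual predictor stands in for the LSTM/GRU component.
    """
    y = np.asarray(series, float)
    # lag matrix for the linear AR(p) component
    X = np.column_stack([y[i:len(y) - p + i] for i in range(p)])
    t = y[p:]
    coef, *_ = np.linalg.lstsq(X, t, rcond=None)
    resid = t - X @ coef                # input to the nonlinear component
    # stub nonlinear model: mean residual of the k most similar lag vectors
    last = y[-p:]
    d = np.linalg.norm(X - last, axis=1)
    resid_hat = resid[np.argsort(d)[:k]].mean()
    return last @ coef + resid_hat      # forecast = linear + nonlinear parts
```

The point of the decomposition is that the linear model captures autocorrelation while the second model mops up whatever nonlinear structure remains in its residuals.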
There are many techniques that can be used to estimate spray quality traits such as spray coverage, droplet density, droplet count, and droplet diameter. One of the most common is to use water-sensitive papers (WSP) as spray collectors under field conditions and to analyze them using software. However, some droplets may merge after they deposit on the WSP, which can affect the accuracy of the results. In this research, an image processing technique was used for better estimation of the spray traits and to overcome the problem of droplet merger. The droplets were classified as non-merged or merged based on their roundness; the merged droplets were then separated based on the average non-m
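The roundness-based classification can be sketched as follows. The roundness formula 4πA/P² is standard, but the threshold value and the split rule (area divided by the mean single-droplet area) are assumptions, not the paper's measured parameters.

```python
import math

def count_droplets(stains, round_thresh=0.8):
    """Estimate droplet count from (area, perimeter) pairs on WSP.

    Sketch only: `round_thresh` and the area-ratio split rule are
    illustrative assumptions. A stain with roundness = 4*pi*A/P**2
    below the threshold is treated as merged and split into
    round(area / mean single-droplet area) droplets.
    """
    single_areas, merged_areas = [], []
    for area, perim in stains:
        roundness = 4 * math.pi * area / perim**2
        (single_areas if roundness >= round_thresh
         else merged_areas).append(area)
    mean_single = sum(single_areas) / len(single_areas)
    count = len(single_areas)
    count += sum(max(1, round(a / mean_single)) for a in merged_areas)
    return count
```

A perfect circle has roundness 1, and the value drops as two deposits coalesce into an elongated blob, which is what makes the metric usable as a merged/non-merged classifier.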
In this research, we compare the traditional information criteria (AIC, SIC, HQ, and FPE) with the modified divergence information criterion (MDIC) used to determine the order of the autoregressive (AR) model of the data-generating process. The comparison uses simulation, generating data from several autoregressive models with the error term following the normal distribution under different values of its parameters.
Received: 06/23/2020; Accepted: 07/15/2020; Published: 12/31/2021
This work is licensed under a Creative Commons Attribution 4.0 International License
The executive authority differs from one country to another, and from one federal state to another, according to the nature of the political system applied. This research therefore focuses on federal states according to their political systems, and then goes into the details of the executive authority and its role in federal states by referring to the four federal experiments.
Traditionally, path selection within routing is formulated as a shortest-path optimization problem. The objective function for optimization can be any one of a variety of parameters, such as the number of hops, delay, or cost. The problem of least-cost delay-constrained routing is studied in this paper, since a delay constraint is a very common requirement of many multimedia applications and cost minimization captures the need to distribute the network load. An iterative algorithm is therefore proposed in this paper to solve this problem. The results of applying this algorithm show that it gives the optimal path (optimal solution) from among multiple feasible paths (feasible solutions).
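The problem the paper studies can be made concrete with a small sketch. This label-correcting search over (node, delay-used) states is a standard exact method for the delay-constrained least-cost path, not necessarily the paper's own iterative algorithm; the graph encoding is an assumption.

```python
import heapq

def least_cost_delay_path(graph, src, dst, max_delay):
    """Cheapest src->dst path whose total delay stays within max_delay.

    `graph` maps node -> list of (neighbour, cost, delay) with
    integer delays. Sketch only: the paper's iterative algorithm may
    solve the same problem differently.
    """
    # best[(node, delay)] = cheapest cost reaching `node` with that delay
    best = {(src, 0): 0}
    heap = [(0, 0, src, [src])]             # (cost, delay, node, path)
    while heap:
        cost, delay, node, path = heapq.heappop(heap)
        if node == dst:
            return cost, path               # cheapest feasible path
        if cost > best.get((node, delay), float("inf")):
            continue
        for nxt, c, d in graph.get(node, []):
            nd, nc = delay + d, cost + c
            if nd <= max_delay and nc < best.get((nxt, nd), float("inf")):
                best[(nxt, nd)] = nc
                heapq.heappush(heap, (nc, nd, nxt, path + [nxt]))
    return None                             # no feasible path
```

Tightening the delay bound can force the search away from the globally cheapest path onto a costlier but faster one, which is exactly the trade-off delay-constrained multimedia routing must resolve.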
Many accurate inertial-guided missile systems need more complex mathematical calculations and require high-speed processing to ensure real-time operation. This gives rise to the need to develop an efficient