In this paper, a fusion of K models of full-rank weighted nonnegative tensor factor two-dimensional deconvolution (K-wNTF2D) is proposed to separate acoustic sources that have been mixed in an underdetermined reverberant environment. The model is adapted in an unsupervised manner under a hybrid framework of the generalized expectation-maximization and multiplicative update algorithms. The derivation of the algorithm and the development of the proposed full-rank K-wNTF2D model are presented. The algorithm also encodes a set of variable sparsity parameters, derived from the Gibbs distribution, into the K-wNTF2D model. This optimizes each sub-model in K-wNTF2D with the sparsity required to model the time-varying variances of the sources in the spectrogram. In addition, an initialization method is proposed to initialize the parameters in the K-wNTF2D model. Experimental results on an underdetermined reverberant mixing environment show that the proposed algorithm is effective at separating the mixtures, with an average signal-to-distortion ratio of 3 dB.
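The multiplicative-update side of the hybrid framework follows the standard nonnegative-factorization pattern. A minimal sketch of the Lee–Seung multiplicative updates for plain NMF (a two-way special case of the tensor model, not the paper's full K-wNTF2D update rules; all dimensions and the rank are illustrative) might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
V = np.abs(rng.standard_normal((20, 30)))    # nonnegative "magnitude spectrogram"
rank = 4
W = np.abs(rng.standard_normal((20, rank)))  # basis matrix
H = np.abs(rng.standard_normal((rank, 30)))  # activation matrix
eps = 1e-9                                   # guard against division by zero

err0 = np.linalg.norm(V - W @ H)             # reconstruction error before updating
for _ in range(200):
    # Euclidean-distance multiplicative updates: each factor is scaled by a
    # ratio of nonnegative terms, so nonnegativity is preserved throughout.
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
err = np.linalg.norm(V - W @ H)              # error after 200 updates
```

The key property, which carries over to the tensor setting, is that the updates monotonically decrease the reconstruction cost without any step-size tuning.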
Image compression plays an important role in reducing the size and storage requirements of data while significantly increasing the speed of its transmission over the Internet. It has been an important research topic for several decades; recently, deep learning has achieved great success in many areas of image processing, and its use in image compression is gradually increasing. Deep neural networks have also achieved great success in processing and compressing images of various sizes. In this paper, we present a structure for image compression based on a Convolutional AutoEncoder (CAE) for deep learning, inspired by the diversity of the human eye
Project suspensions are among the most persistent problems faced by the construction sector, owing to the sector's complexity and the interdependence of its fundamental delay-risk sources. Machine learning provides a suitable set of techniques for attacking such complex systems. The study aimed to develop a well-organized predictive data tool to examine and learn from delay sources based on historical data of construction projects, using decision trees and naïve Bayesian classification algorithms. An intensive review of the available data was conducted to explore the real reasons for and causes of construction project delays. The results show that the postpo
Variable selection is an essential task in the statistical modeling field. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question a researcher needs to ask is: what are the most significant variables that should be used to describe a given dataset's response? In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined, and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulation datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage
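The method builds on Gibbs sampling: each parameter is drawn in turn from its full conditional posterior. A minimal illustration of the sampler itself on a toy bivariate normal target (not the paper's variable-selection model; the correlation value and conditionals are textbook quantities, not taken from the source):

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_samples, rng):
    """Gibbs sampler for a zero-mean, unit-variance bivariate normal
    with correlation rho. Each full conditional x|y and y|x is a
    univariate normal, so the chain alternates exact draws from the two."""
    x, y = 0.0, 0.0
    s = np.sqrt(1.0 - rho**2)          # conditional standard deviation
    samples = np.empty((n_samples, 2))
    for i in range(n_samples):
        x = rng.normal(rho * y, s)     # draw x | y
        y = rng.normal(rho * x, s)     # draw y | x
        samples[i] = x, y
    return samples

rng = np.random.default_rng(1)
draws = gibbs_bivariate_normal(0.8, 20_000, rng)
corr = np.corrcoef(draws.T)[0, 1]      # should approach the true rho = 0.8
```

In the variable-selection setting, the same alternation is applied to regression coefficients and inclusion indicators instead of x and y.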
In this work, the emission spectra and atomic structure of an aluminum target have been studied theoretically using the Cowan code. The Cowan code was used to calculate electron transitions between atomic configurations, including configuration interaction, using the Hartree-Fock method. The aluminum target can give a good emission spectrum in the XUV region at 10 nm with an oscillator strength of 1.82.
The hydrodynamic properties of laser produced plasma (LPP) were investigated for the purpose of creating a light source working in the EUV region. Such a light source is very important for lithography (semiconductor manufacturing). The improved MEDUSA (Med103) code can calculate the plasma hydrodynamic properties (velocity, electron density,
This paper describes a research effort aimed at developing solar housing models suitable for the Arabian region, since the Arabian Peninsula enjoys very high levels of solar radiation.
The current paper is focused on achieving energy efficiency through utilizing solar energy and conserving energy. This task can be accomplished by implementing the major elements of energy-efficient housing design, such as adopting an optimum photovoltaic system orientation to maximize the capture of solar energy and the production of solar electricity. All precautions were taken to minimize the consumption of solar energy while providing suitable air-conditioning to the inhabitants of the solar house, in addition to the use of energy effici
The term "tight reservoir" commonly refers to reservoirs with low permeability. Tight oil reservoirs have attracted concern owing to their considerable influence on oil output throughout the petroleum sector. Because of their low permeability, producing from tight reservoirs presents numerous challenges. The research aims to perform a hydraulic fracturing treatment in a single vertical well in order to study the feasibility of fracking in the Saady reservoir. The Saady B reservoir in Iraq's Halfaya oil field is the most important tight reservoir there. The diagnostic fracture injection test is determined for HF55 using GOHFER soft
In this work, the study of
A database is an organized and distributed collection of data that allows the client to access the stored data in a simple and convenient way. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
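The Map-Reduce pattern the work relies on can be sketched on a single machine: a map phase emits key-value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. The channel names and per-channel averaging below are hypothetical illustrations, not the paper's actual EEG record format or Hadoop job:

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    """Emit (channel, value) pairs for one EEG record (hypothetical schema)."""
    channel, values = record
    return [(channel, v) for v in values]

def shuffle(pairs):
    """Group emitted pairs by key, as the framework's shuffle stage would."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Aggregate one key's group: here, mean amplitude per channel."""
    return key, sum(values) / len(values)

records = [("Fp1", [1.0, 3.0]), ("Fp2", [2.0, 4.0]), ("Fp1", [5.0])]
pairs = chain.from_iterable(map_phase(r) for r in records)
result = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
# result == {"Fp1": 3.0, "Fp2": 3.0}
```

On Hadoop, the map and reduce functions run in parallel across cluster nodes, which is where the reported response-time reduction comes from.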