In this paper, a fusion of K models of full-rank weighted nonnegative tensor factor two-dimensional deconvolution (K-wNTF2D) is proposed to separate acoustic sources mixed in an underdetermined reverberant environment. The model is adapted in an unsupervised manner under a hybrid framework of the generalized expectation-maximization and multiplicative update algorithms. The derivation of the algorithm and the development of the proposed full-rank K-wNTF2D are presented. The algorithm also encodes a set of variable sparsity parameters, derived from the Gibbs distribution, into the K-wNTF2D model. This optimizes each sub-model in K-wNTF2D with the sparsity required to model the time-varying variances of the sources in the spectrogram. In addition, an initialization method is proposed for the parameters of the K-wNTF2D. Experimental results on underdetermined reverberant mixtures show that the proposed algorithm separates the mixture effectively, with an average signal-to-distortion ratio of 3 dB.
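The K-wNTF2D update rules themselves are lengthy, but the multiplicative-update principle the abstract refers to can be illustrated on plain nonnegative matrix factorization (V ≈ WH). The sketch below is not the authors' K-wNTF2D model; it is a minimal pure-Python illustration of Euclidean multiplicative updates, which keep both factors nonnegative while reducing the reconstruction error:

```python
import random

random.seed(0)
EPS = 1e-9  # guard against division by zero in the update denominators

def matmul(A, B):
    Bt = list(zip(*B))
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def frob_err(V, W, H):
    WH = matmul(W, H)
    return sum((V[i][j] - WH[i][j]) ** 2
               for i in range(len(V)) for j in range(len(V[0])))

def mu_step(V, W, H):
    """One multiplicative update of W and H for the Euclidean objective."""
    Ht = transpose(H)
    num, den = matmul(V, Ht), matmul(matmul(W, H), Ht)
    W = [[W[i][k] * num[i][k] / (den[i][k] + EPS)
          for k in range(len(W[0]))] for i in range(len(W))]
    Wt = transpose(W)
    num, den = matmul(Wt, V), matmul(matmul(Wt, W), H)
    H = [[H[k][j] * num[k][j] / (den[k][j] + EPS)
          for j in range(len(H[0]))] for k in range(len(H))]
    return W, H

# Random nonnegative data and a rank-2 factorization
V = [[random.random() for _ in range(6)] for _ in range(5)]
W = [[random.random() for _ in range(2)] for _ in range(5)]
H = [[random.random() for _ in range(6)] for _ in range(2)]
e0 = frob_err(V, W, H)
for _ in range(50):
    W, H = mu_step(V, W, H)
e1 = frob_err(V, W, H)
```

Because the updates are products of nonnegative quantities, nonnegativity of W and H is preserved automatically, which is the property the tensor variants build on.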
Project delays are among the most persistent problems facing the construction industry, owing to the sector's complexity and the interdependence of its underlying delay-risk factors. Machine learning provides a well-suited set of techniques for attacking such complex systems. This study aimed to identify and develop a well-organized predictive data tool that examines and learns from delay sources based on historical data of construction projects, using decision trees and naïve Bayesian classification algorithms. An intensive review of the available data was conducted to explore the real reasons for and causes of construction project delays. The results show that the postpo
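As a hedged illustration of the classification side of such a tool, the sketch below trains a tiny categorical naïve Bayes classifier with Laplace smoothing on made-up project records; the features (funding status, weather) and labels are invented for the example and are not the study's actual delay factors:

```python
from collections import Counter

def train_nb(rows, labels):
    """Fit a categorical naive Bayes model with Laplace smoothing."""
    label_counts = Counter(labels)
    n_features = len(rows[0])
    cond = Counter()                      # (label, feature_idx, value) -> count
    values = [set() for _ in range(n_features)]
    for row, lab in zip(rows, labels):
        for i, v in enumerate(row):
            cond[(lab, i, v)] += 1
            values[i].add(v)
    return label_counts, cond, values, len(labels)

def predict_nb(model, row):
    """Return the label with the highest smoothed posterior score."""
    label_counts, cond, values, n = model
    best, best_score = None, -1.0
    for lab, lc in label_counts.items():
        score = lc / n                    # class prior
        for i, v in enumerate(row):
            score *= (cond[(lab, i, v)] + 1) / (lc + len(values[i]))
        if score > best_score:
            best, best_score = lab, score
    return best

# Hypothetical training records: (funding, weather) -> project outcome
rows = [("late", "bad"), ("late", "good"), ("late", "bad"),
        ("on_time", "good"), ("on_time", "bad"), ("on_time", "good")]
labels = ["delayed", "delayed", "delayed", "on_time", "on_time", "on_time"]
model = train_nb(rows, labels)
pred = predict_nb(model, ("late", "good"))  # -> "delayed"
```

A real deployment would draw its feature set from the delay causes identified in the data review rather than from two toy attributes.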
Green nanotechnology is an exciting and emerging area of science and technology that embraces the principles of green chemistry, with potential benefits for sustainability, safety, and the overall protection of humanity. The green chemistry approach introduces a proper method for the production, processing, and application of less hazardous chemical substances to reduce threats to human health and the environment. The approach calls for in-depth knowledge of the raw materials, particularly in terms of their conversion into nanomaterials and the resulting bioactivities, which pose very few harmful effects for people and the environment. In the twenty-first century, nanotechnology has become a systematic
Machine learning offers significant advantages for many problems in the oil and gas industry, especially for resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. Clarifications of the workflow methodology are presented alongside comprehensive models in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they present are outdated and do not address the real difficulty of the permeability computation. To
Recently, the internet has enabled users to transmit digital media with ease. Despite this convenience, transferred media content is exposed to several threats concerning its confidentiality, such as media authentication and integrity verification. For these reasons, data-hiding methods and cryptography are used to protect the contents of digital media. In this paper, an enhanced method of image steganography combined with visual cryptography is proposed. A secret logo (a binary image) of size 128x128 is encrypted by applying (2-out-of-2) visual cryptography to it, generating two secret shares. During the embedding process, a cover red, green, and blue (RGB) image of size (512
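The (2-out-of-2) sharing step can be sketched as follows. Classical visual cryptography uses subpixel expansion and OR-based stacking of printed transparencies; the sketch below instead uses the simpler XOR-based variant of (2,2) secret sharing, which has the same core security property: one share alone is uniformly random and reveals nothing about the logo. The 8x8 checkerboard is a stand-in for the paper's 128x128 logo:

```python
import random

random.seed(42)

# Hypothetical 8x8 binary "logo" (checkerboard pattern as a placeholder)
secret = [[(i + j) % 2 for j in range(8)] for i in range(8)]

# Share 1 is uniformly random; share 2 is secret XOR share 1,
# so each share in isolation carries no information about the logo.
share1 = [[random.randint(0, 1) for _ in row] for row in secret]
share2 = [[s ^ r for s, r in zip(srow, rrow)]
          for srow, rrow in zip(secret, share1)]

# Combining (XOR-ing) the two shares recovers the logo exactly.
recovered = [[a ^ b for a, b in zip(r1, r2)]
             for r1, r2 in zip(share1, share2)]
```

In the proposed scheme the two shares would then be hidden in the cover RGB image by the steganographic embedding step rather than transmitted openly.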
Spatial data observed on a group of areal units is common in scientific applications. The usual hierarchical approach for modeling such datasets is to introduce a spatial random effect with an autoregressive prior. However, the usual Markov chain Monte Carlo scheme for this hierarchical framework requires the spatial effects to be sampled from their full conditional posteriors one by one, resulting in poor mixing. More importantly, it makes the model computationally inefficient for datasets with a large number of units. In this article, we propose a Bayesian approach that uses the spectral structure of the adjacency to construct a low-rank expansion for modeling spatial dependence. We propose a pair of computationally efficient estimati
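The low-rank idea can be sketched, under assumptions, as follows: extract leading eigenvectors of the adjacency matrix and use them as a basis for the spatial effect, so MCMC samples a few basis coefficients instead of one effect per areal unit. The pure-Python sketch below uses power iteration on a toy 4-node path graph; the article's actual basis construction and estimation procedures are not reproduced here:

```python
def power_iteration(A, iters=500):
    """Leading eigenvalue/eigenvector of a nonnegative symmetric matrix."""
    n = len(A)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)   # infinity-norm estimate of the eigenvalue
        v = [x / lam for x in w]
    return lam, v

# Adjacency matrix of a 4-node path graph: 1 - 2 - 3 - 4
A = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
lam, phi = power_iteration(A)

# Rank-1 expansion: the 4 spatial effects are one basis vector scaled by a
# single coefficient delta, so only delta needs sampling instead of 4 effects.
delta = 0.7  # hypothetical basis coefficient
spatial_effect = [delta * p for p in phi]
```

In practice several leading eigenpairs would be retained, trading a small approximation error for a large reduction in the number of sampled parameters.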
Six isolates of A. pullulans were collected from several sources, including Hibiscus sabdariffa (roselle), old roofs of houses, and bathroom surfaces, referred to as Ap ros1; Ap or2, 3, 4; and Ap bs5, 6, respectively. All isolates were identified based on morphological characteristics and nutritional physiology profiles. All were able to utilize various carbon and nitrogen sources, such as glucose, xylose, sucrose, maltose, ammonium sulfate, ammonium nitrate, and ammonium chloride. They also tested positive for starch and amylase, while α-cellulose, ethanol, and methanol could not be ass
In this research, a number of western al-Anbar clays (red iron clays, attapulgite) were modified by treating them thermally at 650 °C. These clays were then refluxed with 5% sodium hydroxide for 1 hour, using a microwave as the power supply. The research included fractionation of alqayaira crude oil: the asphaltene was removed by precipitation from the crude using a simple paraffin solvent (normal hexane) as the non-solvent. The mixture was then filtered through ash-free filter paper (grade 42), and the dissolved part, the maltenes, was taken, dried at 75 °C, and weighed to find the percentage of the two fractions. The maltenes were divided into three main parts (paraf
The major goal of this research was to use the Euler method to determine the best starting value for eccentricity. Various heights were chosen for satellites affected by atmospheric drag. It was explained how to convert the position and velocity components into orbital elements, and the Euler integration method was described. The results indicated that drag deviates the satellite trajectory from a Keplerian orbit; as a result, the Keplerian orbital elements alter over time. Additionally, the analysis showed that the Euler method can only be used for low Earth orbits between 100 and 500 km and very small eccentricity (e = 0.001).
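A forward-Euler propagation step of the kind described can be sketched as below. The gravitational part is the standard two-body acceleration; the drag term uses a single hypothetical lumped coefficient k (standing in for Cd·A/m times atmospheric density) rather than a real atmosphere model, so the numbers illustrate only the qualitative energy loss:

```python
import math

MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2
RE = 6378.137     # Earth's equatorial radius, km

def euler_step(r, v, dt, k):
    """One forward-Euler step of two-body motion plus a lumped drag term."""
    rmag = math.sqrt(sum(x * x for x in r))
    vmag = math.sqrt(sum(x * x for x in v))
    # acceleration: -mu*r/|r|^3 (gravity) - k*|v|*v (drag, opposing velocity)
    a = [-MU * ri / rmag**3 - k * vmag * vi for ri, vi in zip(r, v)]
    r_new = [ri + vi * dt for ri, vi in zip(r, v)]
    v_new = [vi + ai * dt for vi, ai in zip(v, a)]
    return r_new, v_new

def specific_energy(r, v):
    rmag = math.sqrt(sum(x * x for x in r))
    vmag2 = sum(x * x for x in v)
    return vmag2 / 2 - MU / rmag  # constant on a pure Keplerian orbit

# Circular orbit at 300 km altitude, within the 100-500 km range studied
r = [RE + 300.0, 0.0, 0.0]
v = [0.0, math.sqrt(MU / (RE + 300.0)), 0.0]
k = 1e-6  # hypothetical lumped drag coefficient, 1/km
e_start = specific_energy(r, v)
for _ in range(1000):            # propagate 1000 s with dt = 1 s
    r, v = euler_step(r, v, 1.0, k)
e_end = specific_energy(r, v)
```

The decrease in specific orbital energy over the propagation is the signature of drag pulling the trajectory away from a Keplerian orbit; the forward-Euler scheme's own truncation error is what restricts it to short arcs and near-circular low orbits.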