<p>In combinatorial testing, the construction of covering arrays is a key challenge, influenced by many factors. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining a greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests. Metaheuristic methods handle the tuples that may remain after the greedy strategy removes redundancy, so the final result is kept near-optimal by the metaheuristic algorithm. Consequently, the use of both greedy and HC algorithms in a single test generation system is a good candidate if constructed correctly. This study presents a hybrid greedy hill climbing algorithm (HGHC) that ensures both effectiveness and near-optimal results when generating a small set of test data. To verify that the proposed HGHC outperforms the most widely used techniques in terms of test size, it is compared against them. In contrast to recent methods for producing covering arrays (CAs) and mixed covering arrays (MCAs), this hybrid strategy is superior because it yields the best outcome while reducing the array size and limiting the loss of unique pairings during CA/MCA generation.</p>
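The greedy-plus-hill-climbing idea described above can be sketched in a few lines. This is a minimal illustration, not the paper's actual HGHC algorithm: it greedily builds one test row at a time for pairwise (2-way) coverage, improves each row with single-parameter hill-climbing moves, and falls back to planting a missing pair so progress is guaranteed. The function names and parameters are my own illustrative choices.

```python
import itertools
import random

def uncovered_pairs(params, suite):
    """All value pairs (i, vi, j, vj) not yet covered by any row of the suite."""
    pairs = set()
    for i, j in itertools.combinations(range(len(params)), 2):
        for vi in range(params[i]):
            for vj in range(params[j]):
                pairs.add((i, vi, j, vj))
    for row in suite:
        for i, j in itertools.combinations(range(len(params)), 2):
            pairs.discard((i, row[i], j, row[j]))
    return pairs

def hybrid_pairwise_suite(params, climb_steps=200, seed=0):
    """Greedy row-by-row construction with hill-climbing repair of each row."""
    rng = random.Random(seed)
    suite = []
    while True:
        missing = uncovered_pairs(params, suite)
        if not missing:
            return suite

        def gain(r):  # how many still-missing pairs this row would cover
            return sum((i, r[i], j, r[j]) in missing
                       for i, j in itertools.combinations(range(len(params)), 2))

        row = [rng.randrange(v) for v in params]
        for _ in range(climb_steps):        # hill climbing: accept improving moves
            cand = row[:]
            col = rng.randrange(len(params))
            cand[col] = rng.randrange(params[col])
            if gain(cand) > gain(row):
                row = cand
        if gain(row) == 0:                  # guarantee progress toward full coverage
            i, vi, j, vj = next(iter(missing))
            row[i], row[j] = vi, vj
        suite.append(row)
```

For three binary parameters, the smallest pairwise covering array has 4 rows; the sketch typically lands at or near that bound, which is the size reduction the hybrid approach aims at.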
The coming age requires progress beyond manual work and even beyond machine dependence, and the Brain-Computer Interface (BCI) provides the necessary processing. As the article suggests, it is a pathway between the signals created by a thinking human brain and a computer, which can translate the transmitted signal into action. Brain activity for BCI processing is typically measured using EEG. Throughout this article, we intend to provide an accessible and up-to-date review of EEG-based BCI, concentrating on its technical aspects. In particular, we present the essential neuroscience background that describes how to build an EEG-based BCI, including how to evaluate which signal processing, software, and hardware techniques to use. Individu
Currently, with the huge increase in modern communication and network applications, the speed of transforming and storing data in compact forms is a pressing issue. An enormous number of images are stored and shared among people every moment, especially in social media, but even with these marvelous applications, the limited size of transmitted data remains the main restriction. Essentially all these applications use the well-known Joint Photographic Experts Group (JPEG) standard techniques; likewise, the construction of universally accepted standard compression systems is urgently required to play a key role in this immense revolution. This review is concerned with different
To ensure that a software/hardware product has sufficient quality and functionality, it is essential to conduct thorough testing and evaluation of the many individual software components that make up the application. Many different approaches exist for testing software, including combinatorial testing and covering arrays. Difficulties such as the two-way combinatorial explosion bring up yet another problem: time. Using a client-server architecture, this research introduces a parallel implementation of the TWGH algorithm. Many experiments were conducted to demonstrate the efficiency of this technique. The findings of this experiment were used to determine the increase in speed and co
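When reporting the speed increase of a parallel implementation like the one above, measured speedup is usually compared against the ideal bound from Amdahl's law. The helper below is a generic sketch of that bound, not part of the TWGH study itself:

```python
def amdahl_speedup(parallel_fraction, workers):
    """Ideal speedup predicted by Amdahl's law for a program in which
    `parallel_fraction` of the runtime can be spread across `workers`."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / workers)
```

Even a small serial portion caps the achievable speedup: with 90% parallel work, ten workers yield at most about a 5.3x speedup, which is why client-server overheads matter in such experiments.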
Monaural source separation is a challenging problem because only a single channel is available, yet there is an unlimited range of possible solutions. In this paper, a monaural source separation model based on a hybrid deep learning model, which consists of a convolutional neural network (CNN), a dense neural network (DNN), and a recurrent neural network (RNN), will be presented. A trial-and-error method will be used to optimize the number of layers in the proposed model. Moreover, the effects of the learning rate, optimization algorithms, and the number of epochs on the separation performance will be explored. Our model was evaluated using the MIR-1K dataset for singing voice separation. Moreover, the proposed approach achi
Six proposed simply supported high-strength steel fiber reinforced concrete (HS-SFRC) beams reinforced with FRP (fiber-reinforced polymer) rebars were numerically tested by the finite element method using ABAQUS software to investigate their behavior under flexural failure. The beams were divided into two groups depending on their cross-sectional shape. Group A consisted of four trapezoidal beams (height 200 mm, top width 250 mm, bottom width 125 mm), while group B consisted of two rectangular beams with dimensions of 125 × 200 mm. All specimens had the same total length of 1500 mm and were made of the same high-strength concrete design with a 1% volume fraction of steel fiber.
The rise in the general level of prices in Iraq makes local commodities less able to compete with other commodities, which increases imports and decreases exports, since it raises demand for foreign currencies while decreasing demand for the local currency. This in turn lowers the exchange rate of the local currency against rising foreign exchange rates, and it is one of the most important factors affecting the determination of the exchange rate and its fluctuations. This research deals with the European euro and its movement against the Iraqi dinar. To make an accurate prediction for any process, modern methods can be used through which
Abstract
For sparse system identification, recently suggested algorithms are the l0-norm Least Mean Square (l0-LMS), Zero-Attracting LMS (ZA-LMS), Reweighted Zero-Attracting LMS (RZA-LMS), and p-norm LMS (p-LMS) algorithms, which modify the cost function of the conventional LMS algorithm by adding a constraint on coefficient sparsity. Accordingly, the proposed algorithms are named p-ZA-LMS,
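The common idea behind these sparsity-aware variants can be illustrated with ZA-LMS, whose update is the conventional LMS step plus a "zero attractor" term rho * sgn(w) derived from an l1 penalty on the coefficients. The sketch below is a minimal pure-Python illustration of that standard update, not the paper's proposed combined algorithm; the function name and step sizes are my own choices.

```python
import random

def sgn(v):
    """Sign function used by the zero attractor."""
    return (v > 0) - (v < 0)

def za_lms_identify(x, d, taps, mu=0.05, rho=1e-4):
    """Zero-Attracting LMS: conventional LMS plus an l1 (sign) penalty
    that shrinks small coefficients toward zero, favoring sparse estimates."""
    w = [0.0] * taps
    for n in range(taps - 1, len(x)):
        xn = x[n - taps + 1 : n + 1][::-1]                # [x[n], x[n-1], ...]
        e = d[n] - sum(wi * xi for wi, xi in zip(w, xn))  # estimation error
        w = [wi + mu * e * xi - rho * sgn(wi)             # LMS step + zero attractor
             for wi, xi in zip(w, xn)]
    return w
```

On a sparse unknown system (most taps exactly zero), the attractor drives the inactive coefficients toward zero faster than plain LMS would, which is the motivation for the whole family of algorithms listed above.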