This work implements an electroencephalogram (EEG) signal classifier. The method uses Orthogonal Polynomials (OP) to convert the EEG signal samples into moments. A Sparse Filter (SF) then reduces the number of moments to increase the classification accuracy, and a Support Vector Machine (SVM) classifies the reduced moments into two classes. The proposed method's performance is tested and compared with two existing methods on two datasets. Each dataset is divided into 80% for training and 20% for testing, with 5-fold cross-validation. The results show that the method achieves higher accuracy than the other methods, with best accuracies of 95.6% and 99.5% on the two datasets, respectively. Finally, the results indicate that the number of moments selected by the SF should exceed 30% of the overall EEG samples for the accuracy to be over 90%.
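As a rough, hypothetical sketch of the moment-extraction and reduction steps this abstract describes (the Legendre polynomial family and the variance-based selection criterion are assumptions for illustration, not the paper's actual OP basis or sparse-filter rule):

```python
import numpy as np

def legendre_moments(signal, order):
    """Project a 1-D signal onto the first `order` Legendre polynomials."""
    x = np.linspace(-1.0, 1.0, len(signal))
    moments = np.empty(order)
    for n in range(order):
        coeffs = np.zeros(n + 1)
        coeffs[n] = 1.0                     # select the degree-n polynomial
        basis = np.polynomial.legendre.legval(x, coeffs)
        moments[n] = signal @ basis / len(signal)
    return moments

def select_high_variance(moment_matrix, keep):
    """Keep the `keep` moment indices with highest variance across signals;
    a crude stand-in for the sparse-filter reduction step."""
    idx = np.argsort(moment_matrix.var(axis=0))[::-1][:keep]
    return np.sort(idx)
```

The reduced moment matrix `moment_matrix[:, idx]` could then be fed to any off-the-shelf SVM.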
In this paper, RBF-based multistage auto-encoders are used to detect IDS attacks. RBF networks have numerous applications in real-life settings. The proposed technique comprises two parts: a multistage auto-encoder and an RBF network. The multistage auto-encoder selects the most sensitive features from the input data; the selected features are then fed to the RBF network, which is trained to classify the input data into two labels: attack or no attack. The experiment was conducted using MATLAB 2018 on a dataset comprising 175,341 cases, each with 42 features, and validated using 82,332 cases. The developed approach has been applied here for the first time, to the knowledge of the authors, to dete
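A minimal sketch of the RBF classification stage, assuming Gaussian basis functions with least-squares output weights (the centers, width parameter, and training rule here are illustrative assumptions, not the paper's configuration):

```python
import numpy as np

def rbf_features(X, centers, gamma):
    """Gaussian RBF activation of every sample against every center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def fit_rbf_classifier(X, y, centers, gamma):
    """Least-squares output weights for binary labels in {0, 1}."""
    Phi = rbf_features(X, centers, gamma)
    w, *_ = np.linalg.lstsq(Phi, y.astype(float), rcond=None)
    return w

def rbf_predict(X, centers, gamma, w):
    """Label 1 (attack) when the network output exceeds 0.5."""
    return (rbf_features(X, centers, gamma) @ w > 0.5).astype(int)
```

In the paper's pipeline, `X` would be the reduced feature set produced by the multistage auto-encoder rather than raw input data.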
The reliability of the stress-strength model has attracted many statisticians for several years owing to its applicability in diverse fields such as engineering, quality control, and economics. In this paper, system reliability estimation in the stress-strength model with K parallel components is offered through four types of shrinkage methods: the Constant Shrinkage Estimation Method, the Shrinkage Function Estimator, the Modified Thompson Type Shrinkage Estimator, and the Squared Shrinkage Estimator. A Monte Carlo simulation study compares the proposed estimators using the mean squared error (MSE). The analysis of the shrinkage estimation methods showed that the shrinkage function estimator was the best since
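To illustrate the general idea of comparing a constant-shrinkage estimator against an unshrunk estimator by Monte Carlo MSE (the Normal-mean setting, shrinkage weight, and sample sizes below are illustrative assumptions, not the stress-strength model or estimators of the paper):

```python
import numpy as np

def shrinkage_mse(theta_true, theta_guess, k, n=10, reps=5000, seed=0):
    """Monte Carlo MSE of the sample mean vs. the constant-shrinkage
    estimator k*mean + (1-k)*theta_guess for Normal(theta, 1) samples."""
    rng = np.random.default_rng(seed)
    samples = rng.normal(theta_true, 1.0, size=(reps, n))
    mle = samples.mean(axis=1)                    # unshrunk estimator
    shr = k * mle + (1.0 - k) * theta_guess       # shrink toward the guess
    return ((mle - theta_true) ** 2).mean(), ((shr - theta_true) ** 2).mean()
```

When the prior guess is close to the true value, shrinkage trades a small bias for a large variance reduction, which is the mechanism the paper's MSE comparison exploits.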
There has been a growing interest in the use of chaotic techniques for enabling secure communication in recent years. This need has been motivated by the emergence of a number of wireless services which require the channel to provide very low bit error rates (BER) along with information security. This paper investigates the feasibility of using chaotic communications over Multiple-Input Multiple-Output (MIMO) channels by combining chaos modulation with a suitable Space Time Block Code (STBC). It is well known that chaotic modulation techniques can enhance communication security; however, systems using chaos modulation have been observed to be inferior in BER to conventional communication
The growing interest in the use of chaotic techniques for enabling secure communication in recent years has been motivated by the emergence of a number of wireless services which require the service provider to deliver low bit error rates (BER) along with information security. This paper investigates the feasibility of using chaotic communications over Multiple-Input Multiple-Output (MIMO) channels. While the use of chaotic maps can enhance security, the overall BER performance is degraded when compared to conventional communication schemes. To overcome this limitation, we propose a combination of chaotic modulation and the Alamouti Space Time Block Code. The performance of Chaos Shift Keying (CSK) wi
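A minimal baseband sketch of Chaos Shift Keying with coherent correlation detection, assuming a logistic-map chip sequence and a receiver that knows the reference carrier (the map parameters, chip counts, and detection rule are illustrative assumptions; the papers' MIMO/STBC stages are not modeled here):

```python
import numpy as np

def logistic_chips(x0, length, r=3.99):
    """Zero-mean chip sequence from the logistic map x -> r*x*(1-x)."""
    seq = np.empty(length)
    x = x0
    for i in range(length):
        x = r * x * (1.0 - x)
        seq[i] = x
    return seq - seq.mean()

def csk_modulate(bits, chips_per_bit, x0=0.3):
    """Spread each bit (+1/-1) over a segment of the chaotic carrier."""
    carrier = logistic_chips(x0, len(bits) * chips_per_bit)
    symbols = np.repeat(2 * np.asarray(bits) - 1, chips_per_bit)
    return symbols * carrier, carrier

def csk_demodulate(rx, carrier, chips_per_bit):
    """Coherent detection: correlate each bit interval with the reference."""
    corr = (rx * carrier).reshape(-1, chips_per_bit).sum(axis=1)
    return (corr > 0).astype(int)
```

The security argument rests on the carrier being hard to reproduce without the map's initial condition `x0`; the BER penalty relative to conventional modulation comes from the non-constant energy of the chaotic chips.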
In regression testing, test case prioritization (TCP) is a technique for ordering the available test cases. TCP techniques can improve fault-detection performance, which is measured by the Average Percentage of Faults Detected (APFD). History-based TCP is a TCP technique that considers past execution data to prioritize test cases. The allocation of equal priority to test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP. To solve it, most researchers resort to random ordering of test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement
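The APFD metric mentioned above can be computed from an ordering and a fault-detection matrix; a small sketch (the fault-matrix representation as a dict is an assumption for illustration):

```python
def apfd(order, fault_matrix):
    """Average Percentage of Faults Detected for a test-case ordering.

    order: list of test ids in priority order.
    fault_matrix: dict mapping fault id -> set of test ids that detect it.
    """
    n, m = len(order), len(fault_matrix)
    position = {t: i + 1 for i, t in enumerate(order)}  # 1-based ranks
    # Sum, over faults, of the rank of the first test that detects each fault
    first_detect = sum(min(position[t] for t in tests)
                       for tests in fault_matrix.values())
    return 1.0 - first_detect / (n * m) + 1.0 / (2 * n)
```

Higher APFD means faults are exposed earlier in the run, which is why orderings that break priority ties arbitrarily (e.g. at random) can yield unstable APFD values.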
This study discusses industries dependent on the petrochemical industry in Iraq (plastics as a model) during the period 2005–2020. The study concluded that the plastic industries contribute to advancement and to opportunities to deal efficiently with the challenges posed by new variables, the most important of which are the information and communications revolution and trade liberalization, and this contributes to the competitiveness of these industries. The petrochemical industry in Iraq has an active role in establishing plastic industrial clusters and clusters of micro, small, and medium industries by providing the necessary feedstock for these industries in various fields
The evaluation of the quality of construction projects is a topic that has become necessary because of the absence of quantitative standards for measuring control works and of quality-evaluation standards in construction projects. At present, evaluation depends on the experience of the workers, which leads to apparent differences in the results.
The idea of this research is to establish standards for evaluating the quality of projects in a dedicated system based on a quantitative scale rather than qualitative specification, and to build an expert system, "Crystal", that applies this system and enables engineers to evaluate the quality of their projects easily and more accurately.
Image compression plays an important role in reducing the size and storage of data while significantly increasing the speed of its transmission through the Internet. Image compression has been an important research topic for several decades; recently, deep learning has achieved great success in many areas of image processing, and its use in image compression is gradually increasing. Deep neural networks have also achieved great success in processing and compressing various images of different sizes. In this paper, we present a structure for image compression based on a Convolutional AutoEncoder (CAE) for deep learning, inspired by the diversity of human eye
This paper presents a parametric audio compression scheme intended for scalable audio coding applications, particularly well suited for operation at low rates, in the vicinity of 5 to 32 kbps. The model consists of two complementary components: Sines plus Noise (SN). The principal component of the system is an overlap-add analysis-by-synthesis sinusoidal model based on conjugate matching pursuits. Perceptual information about human hearing is explicitly included in the model by psychoacoustically weighting the pursuit metric. Once analyzed, the SN parameters are efficiently quantized and coded. Our informal listening tests demonstrated that our coder gave competitive performance to the state-of-the-art Helix™ Producer Plus 9 from
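A bare-bones sketch of analysis-by-synthesis sinusoidal extraction over a DFT dictionary: at each step the strongest spectral peak is synthesized and subtracted from the residual (this omits the conjugate-pair refinement, the psychoacoustic weighting, and the noise component of the paper's SN model; the frame length and unweighted selection metric are assumptions):

```python
import numpy as np

def sinusoidal_mp(signal, n_atoms):
    """Greedy sine extraction: pick the strongest DFT bin, synthesize
    that sinusoid, subtract it from the residual, repeat."""
    n = len(signal)
    t = np.arange(n)
    residual = signal.astype(float).copy()
    atoms = []
    for _ in range(n_atoms):
        spec = np.fft.rfft(residual)
        k = int(np.argmax(np.abs(spec)))
        scale = 1.0 if k in (0, n // 2) else 2.0   # DC/Nyquist bins
        amp = scale * np.abs(spec[k]) / n
        phase = np.angle(spec[k])
        sine = amp * np.cos(2.0 * np.pi * k * t / n + phase)
        residual -= sine
        atoms.append((k, amp, phase))
    return atoms, residual
```

The `(bin, amplitude, phase)` triples are the sinusoidal parameters that would then be quantized and coded; the final residual is what the noise model of an SN coder would represent.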