This work implements an Electroencephalogram (EEG) signal classifier. The implemented method uses Orthogonal Polynomials (OP) to convert the EEG signal samples into moments. A Sparse Filter (SF) reduces the number of converted moments to increase the classification accuracy, and a Support Vector Machine (SVM) classifies the reduced moments into two classes. The proposed method's performance is tested and compared with two existing methods on two datasets. Each dataset is divided into 80% for training and 20% for testing, with 5-fold cross-validation. The results show that the proposed method outperforms the other methods in accuracy, achieving best accuracies of 95.6% and 99.5% on the two datasets, respectively. Finally, the results indicate that the number of moments selected by the SF should exceed 30% of the overall EEG samples for the accuracy to exceed 90%.
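The moment-extraction pipeline described above can be sketched in plain Python. This is an illustrative stand-in, not the paper's exact method: a sampled Chebyshev basis substitutes for the specific orthogonal polynomials, and magnitude-based selection substitutes for the Sparse Filter; the SVM stage is omitted.

```python
import math

def chebyshev_basis(order, n):
    # Sample Chebyshev polynomials T_0 .. T_{order-1} at n points in [-1, 1]
    xs = [-1 + 2 * k / (n - 1) for k in range(n)]
    return [[math.cos(j * math.acos(x)) for x in xs] for j in range(order)]

def moments(signal, basis):
    # Project the EEG samples onto each polynomial: one moment per order
    n = len(signal)
    return [sum(b * s for b, s in zip(row, signal)) / n for row in basis]

def sparse_select(m, keep_ratio=0.3):
    # Keep the largest-magnitude moments (a simple stand-in for the Sparse Filter)
    k = max(1, int(len(m) * keep_ratio))
    idx = sorted(sorted(range(len(m)), key=lambda i: -abs(m[i]))[:k])
    return idx, [m[i] for i in idx]
```

The reduced vector returned by `sparse_select` would then be fed to any two-class classifier in place of the raw samples.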
The lossy-FDNR based active filter has an important property among many design realizations: a significant reduction in component count, particularly in the number of op-amps, which consume power. However, the problem with this type is the large component spreads, which affect the filter performance.
In this paper, a Genetic Algorithm is applied to minimize the component spreads (capacitance and resistance spreads). The minimization of these spreads allows the fil
A novel median filter based on the crow optimization algorithm (OMF) is suggested to reduce random salt-and-pepper noise and improve the quality of RGB-colored and gray images. The fundamental idea of the approach is that the crow optimization algorithm first detects noise pixels and then replaces them with an optimum median value according to a criterion of maximizing a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity, absolute square error, and mean square error have been used to test the performance of the suggested filters (the original and the improved median filter) used to remove noise from images. The simulation is carried out in MATLAB R2019b, and the resul
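The detect-then-replace idea can be sketched in Python. This simplifies the method heavily: the crow-search optimization stage is replaced by a plain extreme-intensity test, and only the median replacement and the PSNR measure are shown.

```python
import math

def denoise_salt_pepper(img, lo=0, hi=255):
    # Replace pixels at the extreme intensities (suspected salt/pepper noise)
    # with the median of their 3x3 neighbourhood; clean pixels are left alone.
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            if img[y][x] not in (lo, hi):
                continue
            win = sorted(img[j][i]
                         for j in range(max(0, y - 1), min(h, y + 2))
                         for i in range(max(0, x - 1), min(w, x + 2)))
            out[y][x] = win[len(win) // 2]
    return out

def psnr(a, b, peak=255):
    # Peak signal-to-noise ratio between two equally sized grayscale images
    flat = [(p - q) ** 2 for ra, rb in zip(a, b) for p, q in zip(ra, rb)]
    mse = sum(flat) / len(flat)
    return float("inf") if mse == 0 else 10 * math.log10(peak * peak / mse)
```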
Ensuring the security and confidentiality of multimedia data is a serious challenge given the growing dependence on digital communication. This paper offers a new image cryptography scheme based on the Chebyshev chaotic polynomial map, employing the randomness characteristic of chaos to improve security. The suggested method includes block shuffling, dynamic chaos-based offset key production, inter-layer XOR, and 90-degree block rotations to break the correlations intrinsic to the image. The method is designed for efficiency and scalability, achieving a complexity on the order of the number of pixels n over the specified cipher rounds. The experimental outcomes show strong resistance to cryptanalysis attacks, including statistical, differential, and brut
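The chaotic keystream and one XOR layer can be sketched as follows; the block shuffling and rotation stages are omitted, and the seed x0 = 0.7 and map degree 4 are illustrative choices, not parameters from the paper.

```python
import math

def chebyshev_keystream(x0, degree, n):
    # Iterate the Chebyshev map T_k(x) = cos(k * arccos(x)) on [-1, 1]
    # and quantize each state to one key byte.
    x, key = x0, []
    for _ in range(n):
        x = math.cos(degree * math.acos(x))
        key.append(int((x + 1) * 127.5) & 0xFF)
    return key

def xor_layer(data, key):
    # One XOR layer of the cipher; applying it twice with the same key decrypts.
    return bytes(d ^ k for d, k in zip(data, key))
```

Because the map is deterministic for a given seed and degree, the receiver regenerates the same keystream to invert the XOR layer.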
In this paper, a simple color image compression system is proposed using image signal decomposition. The RGB color bands are converted to the less correlated YUV color model, and the pixel value (magnitude) in each band is decomposed into two values: most and least significant. Given the importance of the most significant value (MSV), which is affected by even simple modifications, an adaptive lossless image compression system is proposed using bit-plane (BP) slicing, delta pulse code modulation (Delta PCM), and adaptive quadtree (QT) partitioning followed by an adaptive shift encoder. On the other hand, a lossy compression system is introduced to handle the least significant value (LSV); it is based on
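The pixel decomposition and the Delta PCM step can be sketched in Python. The 4-bit MSV/LSV split is an illustrative assumption, and the quadtree partitioning and shift encoder are omitted.

```python
def split_pixel(p, msb_bits=4):
    # Decompose an 8-bit pixel into most- and least-significant values
    shift = 8 - msb_bits
    return p >> shift, p & ((1 << shift) - 1)

def delta_encode(vals):
    # Delta PCM: keep the first value, then successive differences
    return vals[:1] + [b - a for a, b in zip(vals, vals[1:])]

def delta_decode(deltas):
    out = deltas[:1]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out
```

Because neighbouring MSVs change slowly, the delta stream is dominated by small values (often zeros), which is what makes the subsequent entropy coding effective.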
Face recognition is required in various applications, and major progress has been witnessed in this area. Many face recognition algorithms have been proposed thus far; however, achieving high recognition accuracy and low execution time remains a challenge. In this work, a new scheme for face recognition is presented using hybrid orthogonal polynomials to extract features. The embedded image kernel technique is used to decrease the complexity of feature extraction, and then a support vector machine is adopted to classify these features. Moreover, a fast overlapping block processing algorithm for feature extraction is used to reduce the computation time. Extensive evaluation of the proposed method was carried out on two different face ima
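The overlapping block partitioning can be sketched as below; block size 4 and step 2 are illustrative values, and the per-block polynomial feature extraction and SVM stages are not shown.

```python
def overlapping_blocks(img, size=4, step=2):
    # Slide a size x size window with the given step over a 2-D image,
    # returning each overlapping block as a list of rows.
    h, w = len(img), len(img[0])
    return [[row[x:x + size] for row in img[y:y + size]]
            for y in range(0, h - size + 1, step)
            for x in range(0, w - size + 1, step)]
```

With a step smaller than the block size, adjacent blocks share pixels, which preserves local structure across block boundaries at modest extra cost.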
Recent advances in wireless communication systems have made use of the OFDM technique to achieve high-data-rate transmission. Sensitivity to the frequency offset between the carrier frequencies of the transmitter and the receiver is one of the major problems in OFDM systems: this offset introduces inter-carrier interference in the OFDM symbol, degrading the BER performance. In this paper, a Multi-Orthogonal-Band (MOB-OFDM) system based on the Discrete Hartley Transform (DHT) is proposed to improve the BER performance. The OFDM spectrum is divided into equal sub-bands, and the data are divided between these bands to form a local OFDM symbol in each sub-band using the DHT. The global OFDM symbol is formed from all sub-bands together using
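The DHT kernel at the heart of the proposed system can be sketched directly; the sub-band splitting and symbol assembly are omitted.

```python
import math

def dht(x):
    # Discrete Hartley Transform: H[k] = sum_t x[t] * cas(2*pi*k*t/N),
    # where cas(a) = cos(a) + sin(a). Real input gives real output.
    n = len(x)
    return [sum(x[t] * (math.cos(2 * math.pi * k * t / n) +
                        math.sin(2 * math.pi * k * t / n))
            for t in range(n)) for k in range(n)]

def idht(X):
    # The DHT is its own inverse up to a 1/N scale factor
    n = len(X)
    return [v / n for v in dht(X)]
```

Unlike the DFT, the DHT maps real samples to real coefficients, which is the property that makes it attractive for building the sub-band OFDM symbols.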
The purpose of the current investigation is to distinguish working memory performance in five patients with vascular dementia, fifteen post-stroke patients with mild cognitive impairment, and fifteen healthy control individuals based on background electroencephalography (EEG) activity. The elimination of EEG artifacts using wavelet transform (WT) pre-processing denoising is demonstrated in this study. Spectral entropy, permutation entropy, and approximate entropy were all explored. To improve the classification using the k-nearest neighbors (kNN) classifier scheme, a comparative study of using fuzzy neighbourhood preserving analysis with QR-decomposition as a dimensionality reduction technique an
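The feature-then-classify pipeline can be sketched with one of the named entropy measures and a minimal kNN; the dimensionality-reduction stage is omitted, and the toy feature vectors in the usage are hypothetical, not patient data.

```python
import math
from collections import Counter

def permutation_entropy(x, order=3):
    # Shannon entropy of ordinal patterns in sliding windows, normalized to [0, 1]
    patterns = Counter(
        tuple(sorted(range(order), key=lambda i: x[t + i]))
        for t in range(len(x) - order + 1)
    )
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(order))

def knn_predict(train, labels, query, k=3):
    # Classify by majority vote among the k nearest feature vectors
    nearest = sorted(range(len(train)),
                     key=lambda i: sum((a - b) ** 2
                                       for a, b in zip(train[i], query)))
    votes = [labels[i] for i in nearest[:k]]
    return max(set(votes), key=votes.count)
```

A monotone signal has a single ordinal pattern and therefore zero permutation entropy, while irregular background EEG pushes the value toward 1; each epoch's entropy values form the feature vector handed to `knn_predict`.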