This paper proposes an improved solution for EEG-based brain language signal classification using machine learning and optimization algorithms. The aim is to improve brain signal classification for language processing tasks by achieving higher accuracy and faster processing. Feature extraction is performed using a modified Discrete Wavelet Transform (DWT), which decomposes EEG signals into significant frequency components and thereby captures signal characteristics more effectively. A Gray Wolf Optimization (GWO) algorithm is then applied to select the optimal features, retaining those with maximum relevance while minimizing redundancy; this optimization step improves the overall performance of the classification model. For classification, a hybrid Support Vector Machine (SVM) and Neural Network (NN) model is presented, combining the SVM's capacity to handle high-dimensional spaces with the neural network's capacity for non-linear pattern learning. The model was trained and tested on an EEG dataset and achieved a classification accuracy of 97%, indicating the robustness and efficacy of the method. The results suggest that this improved classifier can be used in brain-computer interface systems and neurological evaluations. The combination of machine learning and optimization techniques establishes this approach as a promising direction for further research in EEG signal processing for brain language recognition.
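The DWT-based feature extraction step described above can be illustrated with a minimal sketch. This uses a plain Haar wavelet and sub-band energies as the feature vector; the paper's *modified* DWT, and its wavelet choice, are not specified here, so treat this as a generic example of wavelet-energy features rather than the authors' method.

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform: split a signal
    of even length into approximation (low-frequency) and detail
    (high-frequency) coefficients. Illustrative only; the paper's
    modified DWT is not reproduced here."""
    even, odd = signal[0::2], signal[1::2]
    approx = [(e + o) / math.sqrt(2.0) for e, o in zip(even, odd)]
    detail = [(e - o) / math.sqrt(2.0) for e, o in zip(even, odd)]
    return approx, detail

def band_energies(signal, levels=2):
    """Recursively decompose and return the energy of each sub-band,
    a common hand-crafted EEG feature vector."""
    feats = []
    a = list(signal)
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats.append(sum(x * x for x in d))  # detail-band energy
    feats.append(sum(x * x for x in a))      # final approximation energy
    return feats

# Toy 8-sample "EEG" segment; yields 2 detail-band energies + 1 approximation energy
feats = band_energies([1.0, 2.0, 3.0, 4.0, 4.0, 3.0, 2.0, 1.0], levels=2)
```

Because the orthonormal Haar transform preserves energy, the sub-band energies always sum to the total energy of the input segment, which makes them a stable feature set for a downstream SVM/NN classifier.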
Data scarcity is a major challenge when training deep learning (DL) models, which demand large amounts of data to achieve exceptional performance. Unfortunately, many applications have too little data to train DL frameworks. Labeled data usually requires manual annotation by human annotators with extensive background knowledge, a process that is costly, time-consuming, and error-prone. Every DL framework must typically be fed a significant amount of labeled data to learn representations automatically. Ultimately, more data generally yields a better DL model, although performance is also application-dependent. This issue is the main barrier for
The research aims to identify the effect of the jigsaw strategy on learning achievement and engagement for third-grade intermediate students in chemistry. The research sample consisted of (61) students distributed into an experimental group and a control group. The research tools consisted of an achievement test and a measure of learning engagement. The results showed statistically significant differences at the (α = 0.05) level between the experimental group and the control group in both the achievement test and the measure of learning engagement, in favor of the experimental group. In this light, the researcher recommended the use of the jigsaw strategy for teaching the subject matter because of its impact in raising
Objective(s): To evaluate blended learning in nursing education in the Middle Region of Iraq.
Methodology: A descriptive evaluation study was conducted to evaluate blended learning in nursing education in the Middle Region of Iraq from September 26th, 2021 to March 22nd, 2022. The study was carried out at two Colleges of Nursing, at the University of Baghdad and the University of Tikrit in Iraq. A convenience (non-probability) sample of (60) undergraduate nursing students was selected, comprising (30) students from each college of nursing. A self-report questionnaire was constructed from the literature, for e
Records of two regionalized variables, porosity and permeability of reservoir rocks in the Zubair Formation (Zb-109), south Iraq, were processed as an indication of the most important reservoir property, homogeneity, given the importance of these results as criteria for primary and enhanced oil recovery. The results of the dispersion treatment, the statistical indicators, box plots, the rhombus style, and the tangent angles of intersected circles, indicated by the confidence intervals of the porosity and permeability data, have shown that the Zubair units (LS), (1L), and (DJ) have reservoir properties of high quality, in contrast to the Zubair units (MS) and (AB), which have reservoir properties of less q
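The confidence-interval treatment of dispersion mentioned above can be sketched in a few lines. The porosity values below are hypothetical placeholders, not data from Zb-109, and the normal-approximation interval is a generic illustration of the kind of statistical indicator used, not the paper's exact procedure.

```python
import math

def mean_confidence_interval(data, z=1.96):
    """Approximate 95% confidence interval for the mean (normal
    approximation). Illustrates the dispersion / confidence-interval
    treatment applied to porosity and permeability records; the data
    below are hypothetical, not from the Zb-109 well."""
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / (n - 1)  # sample variance
    half = z * math.sqrt(var / n)                        # interval half-width
    return mean - half, mean + half

porosity = [0.18, 0.21, 0.19, 0.22, 0.20, 0.17, 0.23, 0.20]  # fractional porosity
lo, hi = mean_confidence_interval(porosity)
```

A narrow interval relative to the mean indicates low dispersion, i.e. a more homogeneous reservoir unit, which is the quality criterion the abstract describes.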
Sebkhas are considered evaporative geomorphological features in which climate plays an active role. They form part of the surface features of the Mesopotamian plain of Iraq, which contains the most fertile lands; owing to combined natural and human factors, much of this arable land has turned into Sebkha territory. Raw-data satellite images were used: Landsat MSS (30 m) for the year 1976, Landsat 7 ETM, and Landsat 8 (LDCM, Landsat Data Continuity Mission) for the summer of 2013, with geometric correction, enhancement, and image subsetting performed, along with visual analysis of the scenes based on the spectral fingerprints of the Earth's surface. This study has shown that remote sensing techniques are the best in discriminating Sebkha a
A Strength Pareto Evolutionary Algorithm 2 (SPEA2) approach for solving the multi-objective Environmental/Economic Power Dispatch (EEPD) problem is presented in this paper. In the past, minimizing fuel cost was the sole aim (a single objective function) of the economic power dispatch problem. Since the Clean Air Act amendments were applied to reduce SO2 and NOx emissions from power plants, utilities have changed their strategies to reduce pollution and atmospheric emissions as well; adding emission minimization as another objective function made economic power dispatch (EPD) a multi-objective problem with conflicting objectives. SPEA2 is the improved version of SPEA, with better fitness assignment, density estimation, an
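The core idea of conflicting objectives can be made concrete with a Pareto-dominance sketch. This shows only the non-dominated-set selection that SPEA2-style algorithms apply each generation; SPEA2 itself adds strength-based fitness, density estimation, and an external archive on top of this, and the cost/emission numbers below are hypothetical.

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b: no worse in every
    objective and strictly better in at least one. Both objectives
    (fuel cost, emission) are minimized."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset of candidate solutions."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# (fuel cost $/h, emission ton/h) for hypothetical dispatch solutions
cands = [(605.0, 0.22), (610.0, 0.20), (600.0, 0.25), (612.0, 0.21)]
front = pareto_front(cands)
```

Here (612.0, 0.21) is eliminated because (610.0, 0.20) is cheaper *and* cleaner, while the remaining three trade cost against emission and so all survive, which is exactly why EEPD has no single optimum.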
The aim of this paper is to design a PID controller based on an on-line tuning bat optimization algorithm for the step-down DC/DC buck converter system used in the battery operation of mobile applications. The bat optimization algorithm is utilized to obtain the optimal parameters of the PID controller as a simple and fast on-line tuning technique yielding the best control action for the system. The simulation results (using the MATLAB package) show the robustness and effectiveness of the proposed control system in obtaining a suitable voltage control action, as a smooth and unsaturated state of the buck converter input voltage of ( ) volt that will stabilize the buck converter sys
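The controller being tuned can be sketched as a standard discrete PID update. The gains and the first-order toy plant below are placeholders for illustration; in the paper the gains are found on-line by the bat optimization algorithm and the plant is a buck converter model, neither of which is reproduced here.

```python
class PID:
    """Discrete PID controller: a minimal sketch of the controller the
    paper tunes. Gains and plant below are illustrative placeholders."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt            # accumulated error
        deriv = (err - self.prev_err) / self.dt   # error rate of change
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Drive a crude first-order plant toward a 12 V setpoint
# (kd = 0 here to keep the toy simulation well behaved)
pid = PID(kp=0.8, ki=2.0, kd=0.0, dt=0.001)
v = 0.0
for _ in range(10000):
    u = pid.update(12.0, v)
    v += (u - v) * 0.05   # toy plant: output relaxes toward control input
```

A metaheuristic tuner such as the bat algorithm simply searches the (kp, ki, kd) space, scoring each candidate by running a simulation like this loop and measuring an error criterion (e.g. integral absolute error).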
This paper proposes a new lifting scheme (HYBRID algorithm) capable of detecting fraudulent images and documents by decomposing them into their real color value arrays (red, blue, and green) to create retrieval keys for their properties, storing these keys in a database, and then checking a document's originality by decomposing the query image or document as described above and comparing its predicted color values (retrieval keys) with those stored in the database. The proposed algorithm was developed by merging two known lifting schemes (Haar and D4) to produce the HYBRID lifting scheme. The validity and accuracy of the proposed algorithm have been ev
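One of the two base schemes the paper merges, the Haar wavelet in lifting form, can be sketched as a split/predict/update sequence with perfect reconstruction. This shows only the standard Haar lifting steps; the D4 lifting steps and the paper's merging rule for the HYBRID scheme are not reproduced here.

```python
def haar_lifting_forward(x):
    """One level of the Haar transform in lifting form: split into
    even/odd samples, predict each odd sample from its even neighbour,
    then update the even samples with the prediction residual."""
    even, odd = x[0::2], x[1::2]
    detail = [o - e for o, e in zip(odd, even)]           # predict step
    approx = [e + d / 2.0 for e, d in zip(even, detail)]  # update step
    return approx, detail

def haar_lifting_inverse(approx, detail):
    """Undo the update and predict steps, then interleave the samples."""
    even = [a - d / 2.0 for a, d in zip(approx, detail)]
    odd = [d + e for d, e in zip(detail, even)]
    out = []
    for e, o in zip(even, odd):
        out.extend([e, o])
    return out

sig = [10.0, 12.0, 8.0, 9.0]           # e.g. one run of a color channel
a, d = haar_lifting_forward(sig)
rec = haar_lifting_inverse(a, d)        # lifting is exactly invertible
```

Exact invertibility is what makes lifting coefficients usable as stable retrieval keys: the same document always decomposes to the same key values.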
The rapid and enormous growth of the Internet of Things, and its widespread adoption, has resulted in massive quantities of data that must be processed and sent to the cloud. The delays in processing these data and transmitting them to the cloud have led to the emergence of fog computing, a new generation of the cloud in which the fog serves as an extension of cloud services at the edge of the network, reducing latency and traffic. Distributing computational resources to minimize makespan and running costs is one of the key challenges of fog computing. This paper provides a new approach for improving the task scheduling problem in a Cloud-Fog environme
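The makespan-minimizing scheduling problem described above can be illustrated with a simple greedy baseline: assign each task (longest first) to the node that would finish it earliest. This is a generic heuristic sketch with hypothetical task lengths and node speeds, not the paper's proposed approach.

```python
def greedy_schedule(task_lengths, node_speeds):
    """Longest-processing-time greedy heuristic: assign each task to
    the node that would complete it earliest. Illustrates the makespan
    objective; the paper's own scheduler is not reproduced here."""
    finish = [0.0] * len(node_speeds)   # current finish time per node
    assignment = {}
    for i, length in sorted(enumerate(task_lengths), key=lambda t: -t[1]):
        # node that yields the earliest completion time for task i
        best = min(range(len(node_speeds)),
                   key=lambda j: finish[j] + length / node_speeds[j])
        finish[best] += length / node_speeds[best]
        assignment[i] = best
    return assignment, max(finish)      # makespan = latest finish time

tasks = [4.0, 2.0, 8.0, 6.0]   # hypothetical task lengths (e.g. MI)
speeds = [2.0, 1.0]             # hypothetical node rates (e.g. MIPS)
assign, makespan = greedy_schedule(tasks, speeds)
```

Metaheuristic schedulers for Cloud-Fog environments search over such assignments globally, trading this heuristic's speed for solutions with lower makespan and running cost.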
The performance quality and search speed of Block Matching (BM) algorithms are affected by the shapes and sizes of the search patterns used. In this paper, Kite Cross Hexagonal Search (KCHS) is proposed. This algorithm uses different search patterns (kite, cross, and hexagonal) to search for the best Motion Vector (MV). In the first step, KCHS uses a cross search pattern. In the second step, it uses one of the kite search patterns (up, down, left, or right, depending on the first step). In subsequent steps, it uses large/small Hexagonal Search (HS) patterns. This new algorithm is compared with several known fast block matching algorithms, based on search points and Peak Signal-to-Noise Ratio (PSNR). According to resul
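The first (cross-pattern) stage of a search like this can be sketched with a sum-of-absolute-differences cost. This shows only one search step on a toy frame pair; the full KCHS kite and hexagonal stages, and its stopping rule, are not reproduced here.

```python
def sad(a, b):
    """Sum of absolute differences between two equal-size blocks."""
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def get_block(frame, top, left, size):
    return [row[left:left + size] for row in frame[top:top + size]]

def cross_search(ref, cur, top, left, size):
    """Evaluate the small cross pattern (center plus 4 neighbours),
    the first stage of cross-based searches such as KCHS, and return
    the best motion vector (dy, dx) with its matching cost."""
    target = get_block(cur, top, left, size)
    best_mv, best_cost = (0, 0), float("inf")
    for dy, dx in [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]:
        ty, tx = top + dy, left + dx
        if ty < 0 or tx < 0 or ty + size > len(ref) or tx + size > len(ref[0]):
            continue  # candidate block falls outside the frame
        cost = sad(get_block(ref, ty, tx, size), target)
        if cost < best_cost:
            best_mv, best_cost = (dy, dx), cost
    return best_mv, best_cost

# Toy frames: the scene shifts one pixel left between ref and cur
ref = [[r * 10 + c for c in range(6)] for r in range(6)]
cur = [row[1:] + [0] for row in ref]
mv, cost = cross_search(ref, cur, 2, 2, 2)
```

Multi-stage algorithms then re-center the pattern on the winner and switch to kite or hexagonal shapes, cutting the number of SAD evaluations (search points) while preserving PSNR.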