This work implements an Electroencephalogram (EEG) signal classifier. The method uses Orthogonal Polynomials (OP) to convert the EEG signal samples into moments. A Sparse Filter (SF) reduces the number of converted moments to increase classification accuracy. A Support Vector Machine (SVM) then classifies the reduced moments into two classes. The proposed method's performance is tested and compared with two existing methods on two datasets. Each dataset is divided into 80% for training and 20% for testing, with 5-fold cross-validation. The results show that the proposed method outperforms the other methods in accuracy, achieving best accuracies of 95.6% and 99.5% on the two datasets, respectively. Finally, the results make it clear that the number of moments selected by the SF should exceed 30% of the overall EEG samples for the accuracy to be over 90%.
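The evaluation protocol described above (80/20 split with 5-fold cross-validation on the training portion, followed by SVM classification) can be sketched as follows. This is a minimal illustration with synthetic placeholder features standing in for the reduced OP moments, not the paper's actual pipeline.

```python
# Sketch of the evaluation protocol: 80/20 train/test split,
# 5-fold cross-validation on the training part, SVM classifier.
# The feature matrix here is random noise standing in for the
# reduced OP moments (an assumption for illustration only).
import numpy as np
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))       # placeholder for reduced moments
y = rng.integers(0, 2, size=200)     # two classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          random_state=0)
clf = SVC(kernel="rbf")
cv_scores = cross_val_score(clf, X_tr, y_tr, cv=5)  # 5-fold CV
clf.fit(X_tr, y_tr)
test_acc = clf.score(X_te, y_te)     # accuracy on held-out 20%
```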
This paper presents an improved technique based on the Ant Colony Optimization (ACO) algorithm. The procedure is applied to a Single Machine with Infinite Bus (SMIB) system with a power system stabilizer (PSS) at three different loading regimes. The simulations are carried out in MATLAB. The results show that with Improved Ant Colony Optimization (IACO) the system gives better performance with fewer iterations compared with a previous modification of ACO. In addition, the probability of selecting an arc depends on the best ant's performance and the evaporation rate.
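For readers unfamiliar with ACO, the two ingredients the abstract names (arc-selection probability and an evaporation-based pheromone update rewarding the best ant) can be sketched in their standard textbook form. This is not the paper's exact IACO rule, only an illustrative baseline.

```python
# Standard ACO building blocks (illustrative, not the paper's IACO):
# arcs are chosen with probability proportional to tau^alpha * eta^beta,
# and pheromone evaporates at rate rho before the best ant deposits.
import random

def select_arc(arcs, tau, eta, alpha=1.0, beta=2.0):
    """Roulette-wheel arc selection on pheromone tau and heuristic eta."""
    weights = [tau[a] ** alpha * eta[a] ** beta for a in arcs]
    r = random.uniform(0.0, sum(weights))
    acc = 0.0
    for a, w in zip(arcs, weights):
        acc += w
        if r <= acc:
            return a
    return arcs[-1]

def update_pheromone(tau, best_arc, best_cost, rho=0.1):
    """Evaporate all arcs, then reward the best ant's arc."""
    for a in tau:
        tau[a] *= (1.0 - rho)
    tau[best_arc] += 1.0 / best_cost
```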
In this paper, RBF-based multistage auto-encoders are used to detect IDS attacks. RBF has numerous applications in various real-life settings. The proposed technique involves two parts: a multistage auto-encoder and an RBF. The multistage auto-encoder is applied to select the top, most sensitive features from the input data. The features selected by the multistage auto-encoder are fed as input to the RBF, and the RBF is trained to classify the input data into two labels: attack or no attack. The experiment was carried out using MATLAB 2018 on a dataset comprising 175,341 cases, each of which involves 42 features, and is validated using 82,332 cases. To the knowledge of the authors, the developed approach has been applied here for the first time to dete
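The shape of the pipeline (feature reduction feeding a classifier that emits attack/no-attack labels) can be sketched loosely. PCA here is purely a stand-in for the multistage auto-encoder, and the RBF network is approximated by an RBF-kernel SVM; both substitutions and the synthetic data are assumptions for illustration.

```python
# Loose sketch of the pipeline shape only: a reduction stage
# (PCA as a placeholder for the multistage auto-encoder) feeding
# an RBF-kernel classifier (a stand-in for the RBF network) that
# outputs 0 = no attack, 1 = attack. Data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 42))       # 42 features, as in the dataset
y = rng.integers(0, 2, size=300)     # binary attack labels

model = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
model.fit(X, y)
pred = model.predict(X[:5])          # labels for five cases
```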
Polyaniline nanofibers (PAni-NFs) were synthesized at various aniline concentrations (0.12, 0.16, and 0.2 g/l) and different times (2 h and 3 h) by the hydrothermal method at 90°C. Characterization was conducted using X-ray diffraction (XRD), Fourier Transform Infrared (FTIR) spectra, Ultraviolet-Visible (UV-VIS) absorption spectra, Thermogravimetric Analysis (TGA), and Field Emission Scanning Electron Microscopy (FE-SEM). The X-ray diffraction patterns revealed the amorphous nature of all the produced samples. FE-SEM demonstrated that the polyaniline has a nanofiber-like structure. The typical peaks of PAni were observed at 1580, 1300-1240, and 821 cm⁻¹, attributed to the chemical bonding of the formed PAni through FTIR spectroscopy. Also, tests
Ultrasound (US) imaging is an accurate and useful modality for assessing gestational age (GA), estimating fetal weight, and monitoring fetal growth during pregnancy. It is a routine part of prenatal care and can greatly impact obstetric management. Estimation of GA is important in obstetric care; making appropriate management decisions requires an accurate appraisal of GA. Accurate GA estimation may assist obstetricians in appropriately counseling women at risk of a preterm delivery about likely neonatal outcomes, and it is essential in evaluating fetal growth and detecting intrauterine growth restriction. Many formulas are used to estimate fetal GA around the world, but they are not specific fo
The reliability of the stress-strength model has attracted many statisticians for several years owing to its applicability in diverse fields such as engineering, quality control, and economics. In this paper, system reliability estimation in the stress-strength model containing Kth parallel components is offered via four types of shrinkage methods: the Constant Shrinkage Estimation Method, the Shrinkage Function Estimator, the Modified Thompson Type Shrinkage Estimator, and the Squared Shrinkage Estimator. A Monte Carlo simulation study compares the proposed estimators using the mean squared error. The analysis of the shrinkage estimation methods showed that the shrinkage function estimator was the best since
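The comparison idea (ranking estimators by Monte Carlo mean squared error) can be illustrated with a toy example. A constant shrinkage estimator pulls the sample estimate toward a prior guess; the paper's stress-strength estimators are considerably more elaborate, and the constants below are made up for illustration.

```python
# Toy Monte Carlo MSE comparison: a constant shrinkage estimator
# theta_sh = c*theta_hat + (1-c)*theta0 versus the plain sample mean.
# theta0 is a prior guess; all numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
theta_true, theta0, c, n, reps = 1.0, 0.9, 0.7, 20, 5000

mse_plain = mse_shrink = 0.0
for _ in range(reps):
    sample = rng.normal(theta_true, 1.0, size=n)
    est = sample.mean()
    shrunk = c * est + (1 - c) * theta0   # shrink toward theta0
    mse_plain += (est - theta_true) ** 2
    mse_shrink += (shrunk - theta_true) ** 2

mse_plain /= reps
mse_shrink /= reps   # smaller MSE when theta0 is a good guess
```

With a prior guess close to the truth, the shrinkage estimator trades a small bias for a large variance reduction, which is why such estimators can win on MSE.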
The evaluation of the quality of construction projects is a topic that has become necessary because of the absence of quantitative standards for measuring control works and of quality evaluation standards in construction projects. At present, evaluation depends on the experience of the workers, which leads to apparent differences in the valuation.
The idea of this research is to set standards for evaluating the quality of projects in a special system that depends on a quantitative scale rather than qualitative specification, and to prepare an expert system, “Crystal”, that applies this system so that engineers can evaluate the quality of their projects easily and more accurately.
The introduction of concrete damaged plasticity material models has significantly improved the accuracy with which the structural response of concrete elements can be predicted. Research into this method's accuracy in analyzing complex concrete forms has been limited. A damage model combined with a plasticity model, based on continuum damage mechanics, is recommended for effectively predicting and simulating concrete behaviour. Damage parameters, such as compressive and tensile damage, can be defined to accurately simulate concrete behaviour in a damaged-plasticity model. This research aims to propose an analytical model for assessing concrete compressive damage based on stiffness deterioration. The prop
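The stiffness-deterioration idea can be made concrete with the usual scalar-damage relation, in which the damage variable d is one minus the ratio of degraded (unloading) stiffness to initial stiffness, so the damaged modulus is (1 − d)·E₀. The moduli below are illustrative assumptions, not the paper's calibration.

```python
# Scalar damage from stiffness deterioration (illustrative values):
# d = 1 - E_unloading / E0, so the damaged stiffness is (1 - d) * E0.
def compressive_damage(E0, E_unloading):
    """Damage parameter from initial and degraded elastic moduli."""
    return 1.0 - E_unloading / E0

E0 = 30000.0      # MPa, initial elastic modulus (assumed)
E_unl = 18000.0   # MPa, unloading stiffness after damage (assumed)
d = compressive_damage(E0, E_unl)   # fraction of stiffness lost
```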
In regression testing, test case prioritization (TCP) is a technique for ordering the available test cases. TCP techniques can improve fault detection performance, which is measured by the Average Percentage of Faults Detected (APFD). History-based TCP is one family of TCP techniques that considers past execution data to prioritize test cases. The allocation of equal priority to test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To solve this problem in regression testing, most researchers resort to random ordering of test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement
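The APFD metric mentioned above has a standard closed form, APFD = 1 − (ΣTFᵢ)/(n·m) + 1/(2n), where n is the number of tests, m the number of faults, and TFᵢ the 1-based position of the first test that reveals fault i. A small sketch with a hypothetical fault matrix:

```python
# APFD = 1 - (sum of TF_i) / (n * m) + 1 / (2n), where TF_i is the
# 1-based position of the first test revealing fault i.
def apfd(order, fault_matrix):
    """order: prioritized list of test ids;
    fault_matrix: {test_id: set of faults that test detects}."""
    faults = set().union(*fault_matrix.values())
    n, m = len(order), len(faults)
    first_pos = {}
    for pos, t in enumerate(order, start=1):
        for f in fault_matrix.get(t, ()):
            first_pos.setdefault(f, pos)
    return 1 - sum(first_pos[f] for f in faults) / (n * m) + 1 / (2 * n)

# Hypothetical example: 4 tests, 3 faults.
detects = {"t1": {"f1"}, "t2": {"f2", "f3"}, "t3": set(), "t4": {"f1"}}
score = apfd(["t2", "t1", "t3", "t4"], detects)
```

Orderings that reveal faults earlier push the TFᵢ values down and the APFD up, which is exactly what a good prioritization should do.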