Among the main challenges in developing an effective network-based intrusion detection system (IDS) are analyzing large volumes of network traffic and learning the decision boundaries between normal and abnormal behavior. Combining feature selection with efficient classifiers in the detection system can address these problems. Feature selection identifies the most relevant features, reducing the dimensionality and the complexity of analyzing the network traffic. Moreover, building the predictive model on only the most relevant features simplifies the model, shortening the time needed to build the classifier and consequently improving detection performance. In this study, two sets of selected features were adopted to train four machine-learning classifiers. The two feature sets were obtained using a Genetic Algorithm (GA) and Particle Swarm Optimization (PSO), respectively; both evolutionary algorithms are known to be effective at solving optimization problems. The classifiers used in this study (Naïve Bayes, k-Nearest Neighbor, Decision Tree, and Support Vector Machine) were trained and tested on the NSL-KDD dataset, and their performance was evaluated for different numbers of selected features. The experimental results indicate that detection accuracy improves by approximately 1.55% with the PSO-based selected features compared with the GA-based selected features. The Decision Tree classifier trained on the PSO-based selected features outperformed the other classifiers, with accuracy, precision, recall, and F-score of 99.38%, 99.36%, 99.32%, and 99.34%, respectively. The results show that coupling optimal features with a good classifier in a detection system can reduce the model building time, lower the computational burden of analyzing the data, and consequently attain a high detection rate.
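As a rough illustration of the wrapper approach the abstract describes, the sketch below runs a binary PSO over feature masks, scoring each mask with leave-one-out 1-nearest-neighbour accuracy. The dataset, swarm sizes, and the 1-NN fitness are invented stand-ins for NSL-KDD and the paper's classifiers, not a reproduction of the study.

```python
import math
import random

random.seed(0)

# Synthetic data: 6 features, only features 0 and 1 carry the class signal.
def make_data(n=60):
    X, y = [], []
    for _ in range(n):
        label = random.randint(0, 1)
        row = [label + random.gauss(0, 0.3),              # informative
               label + random.gauss(0, 0.3),              # informative
               random.gauss(0, 1), random.gauss(0, 1),    # noise
               random.gauss(0, 1), random.gauss(0, 1)]    # noise
        X.append(row)
        y.append(label)
    return X, y

def loo_1nn_accuracy(X, y, mask):
    """Leave-one-out 1-NN accuracy using only the features selected by mask."""
    feats = [i for i, m in enumerate(mask) if m]
    if not feats:
        return 0.0
    correct = 0
    for i in range(len(X)):
        best, best_d = None, float("inf")
        for j in range(len(X)):
            if i == j:
                continue
            d = sum((X[i][f] - X[j][f]) ** 2 for f in feats)
            if d < best_d:
                best_d, best = d, j
        correct += int(y[best] == y[i])
    return correct / len(X)

def binary_pso(X, y, n_feat=6, n_particles=8, iters=15, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.randint(0, 1) for _ in range(n_feat)] for _ in range(n_particles)]
    vel = [[0.0] * n_feat for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pscore = [loo_1nn_accuracy(X, y, p) for p in pos]
    g = pscore.index(max(pscore))
    gbest, gscore = pbest[g][:], pscore[g]
    for _ in range(iters):
        for k in range(n_particles):
            for d in range(n_feat):
                r1, r2 = random.random(), random.random()
                vel[k][d] = (w * vel[k][d]
                             + c1 * r1 * (pbest[k][d] - pos[k][d])
                             + c2 * r2 * (gbest[d] - pos[k][d]))
                # Sigmoid transfer function turns velocity into a bit probability.
                pos[k][d] = 1 if random.random() < 1.0 / (1.0 + math.exp(-vel[k][d])) else 0
            s = loo_1nn_accuracy(X, y, pos[k])
            if s > pscore[k]:
                pscore[k], pbest[k] = s, pos[k][:]
                if s > gscore:
                    gscore, gbest = s, pos[k][:]
    return gbest, gscore

X, y = make_data()
mask, score = binary_pso(X, y)
```

The fitness call is the expensive part of any wrapper method, which is exactly why the abstract argues that a reduced feature set also reduces model building time.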
A new microphotometer was constructed in our laboratory for the determination of molybdenum(VI) through its catalytic effect on the reaction of hydrogen peroxide with potassium iodide in an acidic medium (0.01 mM H2SO4). Linearity of 97.3% was obtained over the range 5–100 ppm, the repeatability of the results was better than 0.8%, and 0.5 ppm was obtained as the L.U. The method was applied to the determination of molybdenum(VI) in a medicinal sample (Centrum), and the results of the developed method compared well with those of the conventional method.
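The linearity and read-back steps behind such a calibration can be sketched with an ordinary least-squares line and its coefficient of determination; the concentration/signal pairs below are invented for illustration and are not the paper's measurements.

```python
# Least-squares calibration line for a catalytic photometric method.
# Invented standards spanning the paper's 5-100 ppm working range.
conc   = [5, 10, 25, 50, 75, 100]              # Mo(VI) standards, ppm
signal = [0.06, 0.11, 0.27, 0.52, 0.80, 1.01]  # photometer response (a.u.)

n = len(conc)
mx = sum(conc) / n
my = sum(signal) / n
sxx = sum((x - mx) ** 2 for x in conc)
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
slope = sxy / sxx
intercept = my - slope * mx

# R^2 quantifies the linearity of the calibration.
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(conc, signal))
ss_tot = sum((y - my) ** 2 for y in signal)
r2 = 1 - ss_res / ss_tot

# An unknown's concentration is read back from the fitted line.
unknown_signal = 0.40
estimated_ppm = (unknown_signal - intercept) / slope
```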
Evaluation was carried out on an existing furrow irrigation system located in an open agricultural field within Hor Rajabh Township, south of Baghdad, Iraq (latitude 33°09’ N, longitude 44°24’ E). Two plots were chosen for comparison: treatment plot T1, which used subsurface water retention technology (SWRT) with a furrow irrigation system, and treatment plot T2, which used furrow irrigation without SWRT. The two treatment plots were compared to study the efficiency of the applied water with respect to crop yield. In terms of agricultural productivity and water use efficiency, plot T1 outperformed plot T2, according to the study’s final findings.
Imitation learning is an effective method for training an autonomous agent to accomplish a task by imitating expert behavior in demonstrations. However, traditional imitation learning methods require a large number of expert demonstrations in order to learn a complex behavior. This disadvantage has limited the potential of imitation learning in complex tasks where expert demonstrations are not sufficient. To address this problem, we propose a Generative Adversarial Network-based model designed to learn optimal policies using only a single demonstration. The proposed model is evaluated on two simulated tasks in comparison with other methods. The results show that our proposed model is capable of completing complex tasks from a single demonstration.
Longitudinal data is becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
In this research, the focus was on compiling and analyzing such data, as cluster analysis plays an important role in identifying and grouping co-expressed profiles over time. The grouped profiles were then employed in the nonparametric smoothing cubic B-spline model, which provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope. The model is also more flexible and can capture more complex patterns and fluctuations in the data.
The longitudinal balanced data profile was compiled into subgroups.
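The C²-continuity that the cubic B-spline model relies on comes from the basis functions themselves. A minimal sketch of evaluating that basis with the Cox-de Boor recursion follows; the knot vector is chosen arbitrarily for illustration and is not from the study.

```python
def bspline_basis(i, k, t, knots):
    """Value of the i-th B-spline basis function of degree k at point t,
    computed with the Cox-de Boor recursion."""
    if k == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + k] != knots[i]:
        left = ((t - knots[i]) / (knots[i + k] - knots[i])
                * bspline_basis(i, k - 1, t, knots))
    right = 0.0
    if knots[i + k + 1] != knots[i + 1]:
        right = ((knots[i + k + 1] - t) / (knots[i + k + 1] - knots[i + 1])
                 * bspline_basis(i + 1, k - 1, t, knots))
    return left + right

knots = [0, 0, 0, 0, 1, 2, 3, 4, 4, 4, 4]   # clamped cubic knot vector
n_basis = len(knots) - 3 - 1                 # 7 cubic basis functions

# At any interior point, the cubic bases are non-negative and sum to one
# (partition of unity), which is what makes the fitted curve smooth.
t = 1.7
vals = [bspline_basis(i, 3, t, knots) for i in range(n_basis)]
total = sum(vals)
```

A smoothing fit would combine these bases with a roughness penalty on the second derivative; only the basis evaluation is shown here.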
In this work, the fractional damped Burger's equation (FDBE) is considered.
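The equation itself did not survive extraction. A commonly used time-fractional form of the damped Burgers equation, given here as an assumption about the intended model rather than the paper's own statement, is:

```latex
\frac{\partial^{\alpha} u}{\partial t^{\alpha}}
  + u \frac{\partial u}{\partial x}
  - \nu \frac{\partial^{2} u}{\partial x^{2}}
  + \lambda u = 0, \qquad 0 < \alpha \le 1,
```

where \(\alpha\) is the fractional order of the time derivative, \(\nu\) the viscosity, and \(\lambda\) the damping coefficient; at \(\alpha = 1\) it reduces to the classical damped Burgers equation.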
Variable selection is an essential and necessary task in the statistical modeling field. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question a researcher needs to ask is which variables are most significant for describing a given dataset’s response. In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined, and the posterior distributions of all the parameters are derived. The new variable selection method is tested using four simulated datasets, and the new approach is compared with some existing techniques, including Ordinary Least Squares (OLS) and the Least Absolute Shrinkage and Selection Operator (Lasso).
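A minimal sketch of Gibbs-sampler variable selection in the spike-and-slab spirit, assuming a linear model with known noise variance; the data, priors, and hyperparameters below are illustrative choices, not the method or settings of the paper.

```python
import math
import random

random.seed(1)

# Synthetic regression: 4 predictors, only x0 and x2 truly enter the model.
n, p = 40, 4
true_beta = [2.0, 0.0, -1.5, 0.0]
sigma, tau = 0.5, 3.0          # noise sd (treated as known) and slab sd
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [sum(b * xi for b, xi in zip(true_beta, row)) + random.gauss(0, sigma)
     for row in X]

beta = [0.0] * p               # current coefficients (0 when excluded)
gamma = [0] * p                # inclusion indicators
incl_counts = [0] * p
iters, burn = 400, 100

for it in range(iters):
    for j in range(p):
        # Partial residual with predictor j removed.
        r = [y[i] - sum(beta[k] * X[i][k] for k in range(p) if k != j)
             for i in range(n)]
        s = sum(X[i][j] ** 2 for i in range(n))
        c = sum(X[i][j] * r[i] for i in range(n))
        a = s / sigma ** 2 + 1.0 / tau ** 2   # posterior precision of beta_j
        m = (c / sigma ** 2) / a              # posterior mean of beta_j
        # Log Bayes factor for gamma_j = 1 vs 0, slab integrated out,
        # with prior inclusion odds of 1.
        log_bf = -0.5 * math.log(tau ** 2 * a) + c ** 2 / (2 * sigma ** 4 * a)
        p_incl = 1.0 / (1.0 + math.exp(-log_bf))
        gamma[j] = 1 if random.random() < p_incl else 0
        beta[j] = random.gauss(m, 1 / math.sqrt(a)) if gamma[j] else 0.0
    if it >= burn:
        for j in range(p):
            incl_counts[j] += gamma[j]

# Posterior inclusion probabilities estimated from the post-burn-in draws.
incl_prob = [cnt / (iters - burn) for cnt in incl_counts]
```

With a strong signal, the sampler's inclusion frequencies concentrate on the truly active predictors, which is the output a selection rule would threshold.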
Lately, great interest has emerged in finding educational alternatives for teaching and improving motor skills according to modern educational methods that take into account individual differences and the learner's speed of learning, through individual learning in which the learner teaches himself by passing through various educational situations to acquire skills and information, making the learner the focus of the educational process. Among these alternatives is interactive video. Through the educational training units at the Model Squash School of the Central Union, the researchers noted that most of the methods used in learning basic skills take a lot of time in the educational program and do not involve
This study was motivated by the fact that some project administrations still do not follow the appropriate scientific methods that would enable them to perform their work in a manner that achieves the goals for which those projects were established, in addition to exceeding planned times and costs. The study therefore aims to apply network diagram methods in planning, scheduling, and monitoring the construction project of the Alzeuot intersection bridge in the city of Ramadi, taken as the research sample, as it is one of the strategic projects being implemented in the city and one of the projects that faced several problems during its implementation. The project's problem was studied according to scientific methods through the application of network diagram techniques.
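Network-diagram planning and scheduling of the kind the study applies typically reduces to the critical-path method (CPM). A small sketch with invented activities and durations (placeholders, not data from the Alzeuot bridge project):

```python
from collections import defaultdict

# activity: (duration in days, predecessors) -- invented example network.
activities = {
    "A": (3, []),          # site preparation
    "B": (5, ["A"]),       # foundations
    "C": (4, ["A"]),       # material delivery
    "D": (6, ["B", "C"]),  # piers
    "E": (2, ["D"]),       # deck finishing
}

# Forward pass: earliest start/finish times.
# (Dict insertion order is assumed to respect precedence here.)
es, ef = {}, {}
for act, (dur, preds) in activities.items():
    es[act] = max((ef[p] for p in preds), default=0)
    ef[act] = es[act] + dur

project_duration = max(ef.values())

# Backward pass: latest start/finish times, which expose the slack.
succs = defaultdict(list)
for act, (_, preds) in activities.items():
    for p in preds:
        succs[p].append(act)

ls, lf = {}, {}
for act in reversed(list(activities)):
    dur, _ = activities[act]
    lf[act] = min((ls[s] for s in succs[act]), default=project_duration)
    ls[act] = lf[act] - dur

# Activities with zero total float form the critical path.
critical_path = [a for a in activities if ls[a] == es[a]]
```

Monitoring then amounts to tracking whether critical activities slip, since any delay on the zero-slack path delays the whole project.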
Five sites were chosen to the north of Babil Governorate in order to identify the limnological features and the impact of the Hindiya Dam during 2019. Site S2 was located near the dam to reflect the ecological features of that location, site S1 was located upstream of the dam as a control site, and the two other sites, S3 and S4, were located downstream of the dam. The results of the study showed a close correlation between air and water temperature at all sites, and there were significant differences in the averages of thirteen of the eighteen water parameters. Water temperature, total alkalinity, bicarbonate, DO, POS, TH, and Mg+2 ions decreased from 22.76 °C, 203.33 mg/L,
For several applications, it is very important to have an edge detection technique that matches human visual contour perception and is less sensitive to noise. The edge detection algorithm described in this paper is based on the results obtained by Maximum a Posteriori (MAP) and Maximum Entropy (ME) deblurring algorithms. The technique makes a trade-off between sharpening and smoothing the noisy image. One advantage of the described algorithm is that it is less sensitive to noise than the Marr and Geuen techniques, which are considered among the best edge detection algorithms in terms of matching human visual contour perception.
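The sharpening/smoothing trade-off can be illustrated with a generic blur-then-gradient pipeline. This is not the MAP/ME-based algorithm of the paper, only the standard baseline idea of suppressing noise before differentiation, run on an invented synthetic image.

```python
import random

random.seed(2)

N = 16
# Synthetic image: dark left half, bright right half, plus Gaussian noise,
# so the true edge lies between columns 7 and 8.
img = [[(0.0 if x < N // 2 else 1.0) + random.gauss(0, 0.1)
        for x in range(N)] for y in range(N)]

def convolve(im, k):
    """2-D convolution with border clamping."""
    kh, kw = len(k), len(k[0])
    oy, ox = kh // 2, kw // 2
    h, w = len(im), len(im[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in range(kh):
                for dx in range(kw):
                    yy = min(max(y + dy - oy, 0), h - 1)
                    xx = min(max(x + dx - ox, 0), w - 1)
                    acc += k[dy][dx] * im[yy][xx]
            out[y][x] = acc
    return out

blur = [[1/16, 2/16, 1/16], [2/16, 4/16, 2/16], [1/16, 2/16, 1/16]]
sobel_x = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]

smoothed = convolve(img, blur)      # suppress noise first...
gx = convolve(smoothed, sobel_x)    # ...then take the horizontal gradient
edges = [[1 if abs(v) > 1.5 else 0 for v in row] for row in gx]
edge_cols = {x for row in edges for x, v in enumerate(row) if v}
```

More smoothing widens the detected edge and kills fine detail; less smoothing lets noise through as false edges. The MAP/ME deblurring approach the abstract describes is one way to navigate that trade-off more carefully than a fixed Gaussian blur.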