Intrusion detection systems detect attacks inside computers and networks, where detection must be both fast and achieve a high detection rate. Various proposed methods have achieved high detection rates, either by improving an algorithm or by hybridizing it with another; however, they suffer from long processing times, especially after such improvements and when dealing with large volumes of traffic data. On the other hand, past research has applied DNA-sequence detection approaches to intrusion detection systems; the achieved detection rates were very low, but the processing time was fast. Feature selection is also used to reduce computation and complexity, and thereby to speed up the system. A new feature selection method is proposed based on DNA encoding and on the positions of DNA keys. The system has three phases: the first, the pre-processing phase, extracts the keys and their positions; the second, the training phase, selects features based on the key positions obtained during pre-processing; and the third, the testing phase, classifies network traffic records as either normal or attack using the selected features. Performance is measured by detection rate, false alarm rate, accuracy, and time, where time includes both encoding time and matching time. All results are based on using two or three keys and are evaluated on two datasets, KDD Cup 99 and NSL-KDD. The detection rate, false alarm rate, accuracy, encoding time, and matching time for all corrected KDD Cup records (311,029 records) are 96.97%, 33.67%, 91%, 325 s, and 13 s with two keys, and 92.74%, 7.41%, 92.71%, 325 s, and 20 s with three keys.
The corresponding detection rate, false alarm rate, accuracy, encoding time, and matching time for all NSL-KDD records (22,544 records) are 89.34%, 28.94%, 81.46%, 20 s, and 1 s with two keys, and 82.93%, 11.40%, 85.37%, 20 s, and 1 s with three keys. The proposed system was evaluated and compared with previous systems on the basis of encoding time and matching time; the outcomes showed that the present system detects faster than the previous ones.
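The key-and-position idea described in the abstract can be illustrated with a small sketch. This is a minimal illustration under stated assumptions, not the authors' implementation: the fixed-width base-4 encoding and the `encode_value`/`key_positions` helpers are hypothetical choices for mapping numeric traffic features to nucleotide strings and locating key motifs.

```python
# Hedged sketch: a hypothetical DNA-style encoding of network-record features.
NUCLEOTIDES = "ACGT"

def encode_value(value, width=4):
    """Encode a non-negative integer as a fixed-width base-4 nucleotide string."""
    digits = []
    for _ in range(width):
        digits.append(NUCLEOTIDES[value % 4])
        value //= 4
    return "".join(reversed(digits))

def encode_record(record):
    """Concatenate the encodings of all numeric features in one record."""
    return "".join(encode_value(v) for v in record)

def key_positions(sequence, key):
    """Return every start position at which `key` occurs in `sequence`
    (overlapping occurrences included) -- the raw material for
    position-based feature selection."""
    positions = []
    start = sequence.find(key)
    while start != -1:
        positions.append(start)
        start = sequence.find(key, start + 1)
    return positions
```

Matching a short key against an encoded record then reduces to substring search, which is why the matching step can be fast even on large traffic logs.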
This study aims to prepare a standard code of sustainability requirements to contribute to a better understanding of sustainability assessment systems for Iraqi projects in general and for high-rise buildings in particular. Iraq is a developing country that faces significant challenges in the environmental, economic, and social dimensions of sustainability, so it has become necessary to develop an effective building sustainability assessment system that respects the local context in Iraq. This study presents a proposal for a system for assessing the sustainability requirements of Iraqi high-rise buildings (ISHTAR), which has been developed through several integrated
This research presents a model for surveying network configuration, designed and named the Computerized Integrated System for Triangulation Network Modeling (CISTNM). It focuses on the strength of figure as a concept, and then on estimating the relative error (RE) of the computed side (baseline) triangulation element. The CISTNM can compute the maximum elevations of the highest obstacles along the line of sight, the required observational signal tower height, and the contribution of each triangulation station, together with an intervisibility test and analysis. The model is characterized by the flexibility to select either a single-figure or a combined-figures network option. Each option includes three further implicit options, such as triangles, quadri
For sparse system identification, recently suggested algorithms are the l0-norm Least Mean Square (l0-LMS), Zero-Attracting LMS (ZA-LMS), Reweighted Zero-Attracting LMS (RZA-LMS), and p-norm LMS (p-LMS) algorithms, which modify the cost function of the conventional LMS algorithm by adding a coefficient-sparsity constraint. Accordingly, the proposed algorithms are named p-ZA-LMS,
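Of the algorithms named above, ZA-LMS has the simplest sparsity constraint and illustrates the family well: the conventional LMS update is augmented with an l1-penalty ("zero-attractor") term that pulls small coefficients toward zero. The sketch below is a generic textbook formulation, not the code from this paper; the step size `mu` and attractor strength `rho` are illustrative values.

```python
import numpy as np

def za_lms(x, d, num_taps, mu=0.01, rho=1e-4):
    """Zero-Attracting LMS: the conventional LMS update plus a zero-attractor
    term -rho*sign(w) derived from an l1 penalty on the coefficients.
    mu is the LMS step size; rho controls the strength of the attractor."""
    w = np.zeros(num_taps)
    for n in range(len(x) - num_taps + 1):
        u = x[n:n + num_taps][::-1]              # tap-input vector (newest sample first)
        e = d[n + num_taps - 1] - w @ u          # a-priori estimation error
        w = w + mu * e * u - rho * np.sign(w)    # LMS step + zero attractor
    return w
```

On a sparse unknown system (most taps exactly zero), the attractor keeps the inactive taps near zero while the LMS term identifies the few active ones, which is the behavior the sparsity constraint is meant to exploit.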
In this research, the performance of a 5G mobile system is evaluated through the ergodic capacity metric. In any wireless communication system, many parameters have a significant effect on system performance; three main parameters are of concern here: the source power, the number of antennas, and the transmitter-receiver distance. User equipments (UEs) with equal and non-equal powers are used to evaluate system performance, and different antenna techniques are used to demonstrate the differences between SISO, MIMO, and massive MIMO. Using two mobile stations (MS) at different distances from the base station (BS) showed how a massive MIMO system improves performance over the standar
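The ergodic capacity metric used above is commonly estimated by Monte-Carlo averaging of C = E[log2 det(I + (SNR/Nt) H Hᴴ)] over random channel realizations. The sketch below is a generic estimator under the usual i.i.d. Rayleigh-fading and equal-power-allocation assumptions, not the paper's own simulation; the trial count and seed are arbitrary.

```python
import numpy as np

def ergodic_capacity(n_tx, n_rx, snr_db, trials=2000, seed=1):
    """Monte-Carlo ergodic capacity (bits/s/Hz) of an i.i.d. Rayleigh-fading
    channel, C = E[log2 det(I + (SNR/n_tx) * H * H^H)], with the transmit
    power split equally across the n_tx antennas."""
    rng = np.random.default_rng(seed)
    snr = 10.0 ** (snr_db / 10.0)
    total = 0.0
    for _ in range(trials):
        # Complex Gaussian channel matrix, unit average power per entry.
        h = (rng.standard_normal((n_rx, n_tx))
             + 1j * rng.standard_normal((n_rx, n_tx))) / np.sqrt(2.0)
        m = np.eye(n_rx) + (snr / n_tx) * (h @ h.conj().T)
        total += np.log2(np.linalg.det(m).real)
    return total / trials
```

Comparing, say, a 1x1 and a 4x4 configuration at the same SNR makes the SISO/MIMO capacity gap quoted in such studies concrete: capacity grows roughly linearly with min(Nt, Nr).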
In this study, the performance of an adaptive optics (AO) system was analyzed through a numerical computer simulation implemented in MATLAB. Making a phase screen involved turning computer-generated random numbers into two-dimensional arrays of phase values on a grid of sample points with matching statistics. Von Karman turbulence was generated from its power spectral density. Several simulated point spread functions (PSFs) and modulation transfer functions (MTFs) for different values of the Fried coherence diameter (r0) were used to show how rough the atmosphere was. To evaluate the effectiveness of the optical system (telescope), the Strehl ratio (S) was computed. The compensation procedure for an AO syst
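The phase-screen step described above is typically done with an FFT-based recipe: draw complex Gaussian Fourier coefficients, weight them by the square root of the von Karman power spectral density, and inverse-transform. The sketch below follows that generic recipe in Python (the paper's simulation is in MATLAB); the grid size, sample spacing, r0, and outer scale L0 are illustrative assumptions.

```python
import numpy as np

def von_karman_screen(n=256, dx=0.02, r0=0.1, L0=25.0, seed=3):
    """FFT-based random phase screen (rad) with a von Karman spectrum,
    Phi(f) = 0.023 r0^(-5/3) (f^2 + 1/L0^2)^(-11/6).
    n: grid size, dx: sample spacing (m), r0: Fried parameter (m),
    L0: outer scale (m)."""
    rng = np.random.default_rng(seed)
    df = 1.0 / (n * dx)                           # frequency-grid spacing
    fx = np.fft.fftfreq(n, d=dx)
    fxx, fyy = np.meshgrid(fx, fx)
    f2 = fxx ** 2 + fyy ** 2
    psd = 0.023 * r0 ** (-5.0 / 3.0) * (f2 + 1.0 / L0 ** 2) ** (-11.0 / 6.0)
    psd[0, 0] = 0.0                               # remove the piston term
    # Complex Gaussian coefficients shaped by sqrt(PSD).
    cn = (rng.standard_normal((n, n))
          + 1j * rng.standard_normal((n, n))) * np.sqrt(psd) * df
    screen = np.fft.ifft2(cn) * (n * df) ** 2     # discrete -> continuous IFT scaling
    return screen.real
```

Because the PSD scales as r0^(-5/3), a smaller Fried parameter (rougher atmosphere) yields a screen with larger phase variance, which is exactly the dependence the PSF/MTF comparison in the study exploits.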
Researchers are increasingly using multimodal biometrics to strengthen the security of biometric applications. In this study, a strong multimodal human identification model was developed to address the growing problem of spoofing attacks in biometric security systems. Through the use of metaheuristic optimization methods such as the Genetic Algorithm (GA), Ant Colony Optimization (ACO), and Particle Swarm Optimization (PSO) for feature selection, this model incorporates three biometric modalities: face, iris, and fingerprint. Image pre-processing, feature extraction, critical image-feature selection, and multibiometric recognition are the four main steps in the system workflow. To determine its performance, the model wa
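Of the three metaheuristics named, the GA formulation of feature selection is the most compact to sketch: each individual is a bit-mask over the feature set, and the fitness function scores the subset that the mask selects. The code below is a minimal generic GA, not the paper's system; the toy fitness, population size, and operator rates are assumptions for illustration.

```python
import random

def ga_feature_select(n_features, fitness, pop_size=20, generations=30, seed=7):
    """Minimal GA feature selection: bit-mask individuals, tournament
    selection, single-point crossover, bit-flip mutation, 2-elitism."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                                # keep the two best
        while len(next_pop) < pop_size:
            # Two tournament winners become parents.
            a, b = (max(rng.sample(scored, 3), key=fitness) for _ in range(2))
            cut = rng.randrange(1, n_features)
            child = a[:cut] + b[cut:]                        # single-point crossover
            if rng.random() < 0.1:
                i = rng.randrange(n_features)
                child[i] ^= 1                                # bit-flip mutation
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Toy fitness: reward selecting features {0, 3, 5}, penalize extras.
# In the real system this would be recognition accuracy on validation data.
TARGET = {0, 3, 5}
def fitness(mask):
    chosen = {i for i, bit in enumerate(mask) if bit}
    return len(chosen & TARGET) - 0.5 * len(chosen - TARGET)
```

ACO and PSO plug into the same template by replacing the selection/crossover loop with pheromone updates or velocity updates over the same bit-mask representation.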
Hydroponics is the cultivation of plants in water without soil, with an emphasis on meeting the nutritional needs of the plants. This research introduces a smart hydroponic system that enables regular monitoring of every aspect needed to maintain the pH, water, temperature, and soil values. Nevertheless, there is a lack of knowledge that systematically represents the current research. The proposed study presents a systematic literature review of smart hydroponic systems to overcome this limitation. This systematic literature review will assist practitioners in drawing on the existing literature and proposing new solutions based on the available knowledge of smart hydroponic systems. The outcomes of this paper can assist future r
The basic solution for overcoming the difficult issues related to the huge size of digital images is to recruit image compression techniques that reduce image size for efficient storage and fast transmission. In this paper, a new pixel-based scheme is proposed for grayscale image compression that implicitly utilizes hybrid techniques: a spatial-modelling technique based on minimum residuals together with the transform-based Discrete Wavelet Transform (DWT), mixing lossless and lossy techniques to ensure high performance in terms of compression ratio and quality. The proposed technique has been applied to a set of standard test images, and the results obtained are significantly encouraging compared with Joint P
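The DWT component mentioned above can be illustrated with the simplest wavelet, the Haar transform, which splits an image into a low-pass approximation (LL) and three detail subbands (LH, HL, HH); compression schemes then quantize or discard small detail coefficients. This is a generic one-level sketch, not the paper's hybrid minimum-residual scheme.

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2-D Haar wavelet transform (image sides must be even).
    Returns (LL, LH, HL, HH) subbands, each half the size of the input."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row-pair averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row-pair details
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def haar_idwt2(ll, lh, hl, hh):
    """Inverse of haar_dwt2: exact reconstruction from the four subbands."""
    a = np.zeros((ll.shape[0], ll.shape[1] * 2))
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d = np.zeros_like(a)
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    img = np.zeros((a.shape[0] * 2, a.shape[1]))
    img[0::2, :], img[1::2, :] = a + d, a - d
    return img
```

Keeping only the LL band (or coarsely quantizing LH/HL/HH) is the lossy part of such hybrid pipelines; lossless coding of the remaining residuals supplies the quality guarantee.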
Advances in digital technology and the World Wide Web have led to an increase in the digital documents used for various purposes such as publishing and digital libraries. This phenomenon raises awareness of the need for effective techniques that can help during the search and retrieval of text. One of the most needed tasks is clustering, which categorizes documents automatically into meaningful groups. Clustering is an important task in data mining and machine learning, and its accuracy depends tightly on the chosen text representation method. Traditional methods of text representation model documents as bags of words using term frequency-inverse document frequency (TF-IDF). This method ignores the relationship an
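The TF-IDF bag-of-words representation criticized above is straightforward to state: tf is a term's count normalized by document length, and idf = log(N / df) down-weights terms that appear in many documents. A minimal sketch, using plain dictionaries rather than any particular library:

```python
import math
from collections import Counter

def tfidf(docs):
    """TF-IDF vectors for a list of tokenized documents.
    tf = term count / document length; idf = log(N / document frequency)."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))            # each document counts a term once
    vectors = []
    for doc in docs:
        counts = Counter(doc)
        vec = {t: (c / len(doc)) * math.log(n / df[t])
               for t, c in counts.items()}
        vectors.append(vec)
    return vectors
```

Note that a term occurring in every document gets idf = log(1) = 0 and vanishes from all vectors, and that word order and semantic relations between terms are discarded entirely, which is precisely the limitation the abstract points to.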