Intrusion detection systems detect attacks on computers and networks, and must do so quickly and with a high detection rate. Various proposed methods achieve high detection rates, either by improving an algorithm or by hybridizing it with another; however, they suffer from long processing times, especially after such improvements and when handling large volumes of traffic data. Conversely, DNA-sequence detection approaches have been applied successfully to intrusion detection: their processing time is fast, but their detection rates are very low. Feature selection is also used to reduce computation and complexity and thereby speed up the system. A new feature-selection method is proposed here, based on DNA encoding and on the positions of DNA keys. The system has three phases: a pre-processing phase, which extracts the keys and their positions; a training phase, whose main goal is to select features based on the key positions obtained during pre-processing; and a testing phase, which classifies network traffic records as either normal or attack using the selected features. Performance is measured by detection rate, false alarm rate, accuracy, and time, where time includes both encoding time and matching time. All results use either two or three keys and are evaluated on two datasets, KDD Cup 99 and NSL-KDD. For all corrected KDD Cup records (311,029 records), the detection rate, false alarm rate, accuracy, encoding time, and matching time are 96.97%, 33.67%, 91%, 325 s, and 13 s with two keys, and 92.74%, 7.41%, 92.71%, 325 s, and 20 s with three keys.
For all NSL-KDD records (22,544 records), the corresponding results are 89.34%, 28.94%, 81.46%, 20 s, and 1 s with two keys, and 82.93%, 11.40%, 85.37%, 20 s, and 1 s with three keys. The proposed system is evaluated against previous systems, with comparisons based on encoding time and matching time. The outcomes show that the present system is faster than the previous ones.
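The pre-processing idea described above can be sketched in a few lines. This is a toy illustration, not the paper's exact scheme: the 2-bit-to-nucleotide map (`A=00, C=01, G=10, T=11`) and the sample key are assumptions chosen for the example.

```python
# Toy sketch: encode each byte of a traffic record as a DNA string using a
# hypothetical 2-bit -> nucleotide map, then locate every position of a
# pre-extracted "key" subsequence, as the pre-processing phase requires.
NUCLEOTIDES = "ACGT"

def dna_encode(data: bytes) -> str:
    """Map every 2 bits of input to one nucleotide (A=00, C=01, G=10, T=11)."""
    out = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            out.append(NUCLEOTIDES[(byte >> shift) & 0b11])
    return "".join(out)

def key_positions(sequence: str, key: str) -> list:
    """Return every index where `key` occurs in the encoded sequence."""
    positions, start = [], sequence.find(key)
    while start != -1:
        positions.append(start)
        start = sequence.find(key, start + 1)
    return positions

record = dna_encode(b"tcp,http,SF")     # hypothetical record fragment
hits = key_positions(record, "GT")      # "GT" stands in for an extracted key
```

Matching then reduces to substring search over the encoded stream, which is why the encoding and matching times can be reported separately.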
In the present study, bis Schiff bases [I, II] were synthesized by reacting one mole of terephthalaldehyde with two moles of 2-amino-5-mercapto-1,3,4-thiadiazole or 4-aminobenzenethiol in absolute ethanol. Compounds [I, II] were then treated with Na2CO3 in distilled H2O, after which chloroacetic acid was added to yield compounds [III, IV]. O-chitosan derivatives [V, VI] were synthesized by reacting chitosan with compounds [III, IV] in acidic aqueous media according to the Fischer procedure. The grafted O-chitosan [V, VI] was blended with the synthetic polymer polyvinyl alcohol (PVA) to produce polymers [VII, VIII], and these polymers were then blended with gold or silver nanoparticles by u
A three-dimensional (3D) model is the best way to reflect reality in full detail, which explains the tendency of many scientific disciplines to make measurements, calculations, and observations using such models. Although there are many ways to produce a 3D model, such as images, integration techniques, and laser scanning, the quality of their products is not the same in terms of accuracy and detail. This article aims to assess the accuracy of 3D point-cloud models derived from close-range images and laser-scan data, using Agisoft PhotoScan and CloudCompare software, to determine the compatibility of both datasets for several applications. College of Scien
In many areas, such as simulation, numerical analysis, computer programming, decision-making, entertainment, and coding, random number input is required, and a pseudo-random number generator depends on its seed value. In this paper, a hybrid method for pseudo-random number generation is proposed using Linear Feedback Shift Registers (LFSR) and a Linear Congruential Generator (LCG). The hybrid key-generation method merges the two technologies: each method separately generates a new group of numbers with a large key space. A higher level of secrecy is also gained, since the internal numbers generated by the LFSR are combined with the LCG (adopting roots in non-linear iteration loops). LCG and LFSR are linear structures and outputs
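The combination described above can be sketched as follows. All parameters here (the 16-bit LFSR taps, the LCG constants, and the XOR combiner) are illustrative assumptions, not the paper's actual configuration.

```python
# Minimal sketch of the hybrid idea: a 16-bit Fibonacci LFSR and a Linear
# Congruential Generator run side by side, and their outputs are XOR-combined
# into one stream, so neither linear structure is exposed directly.
class HybridPRNG:
    def __init__(self, lfsr_seed: int, lcg_seed: int):
        self.lfsr = (lfsr_seed & 0xFFFF) or 1   # LFSR state must be non-zero
        self.lcg = lcg_seed
        # Classic glibc-style LCG constants, assumed for illustration.
        self.a, self.c, self.m = 1103515245, 12345, 2**31

    def _lfsr_step(self) -> int:
        # Taps for the polynomial x^16 + x^14 + x^13 + x^11 + 1.
        bit = ((self.lfsr >> 0) ^ (self.lfsr >> 2) ^
               (self.lfsr >> 3) ^ (self.lfsr >> 5)) & 1
        self.lfsr = (self.lfsr >> 1) | (bit << 15)
        return self.lfsr

    def _lcg_step(self) -> int:
        self.lcg = (self.a * self.lcg + self.c) % self.m
        return self.lcg

    def next16(self) -> int:
        """One 16-bit output: LFSR state XOR the LCG's low 16 bits."""
        return self._lfsr_step() ^ (self._lcg_step() & 0xFFFF)

rng = HybridPRNG(lfsr_seed=0xACE1, lcg_seed=42)
stream = [rng.next16() for _ in range(5)]
```

The stream is fully determined by the two seeds, so the seed pair plays the role of the key.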
Homomorphic encryption has become a popular and powerful cryptographic primitive for various cloud computing applications, and several developments have been made in recent decades. A few schemes based on coding theory have been proposed, but none of them supports unlimited operations securely. We propose a modified Reed-Muller-code-based symmetric-key fully homomorphic encryption scheme that improves security through a message expansion technique. Message expansion with a prepended random fixed-length string provides a one-to-many mapping between message and codeword, and thus a one-to-many mapping between plaintext and ciphertext. The proposed scheme supports both (mod 2) additive and multiplicative operations without limit. We make an effort to prove
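Why a Reed-Muller code is a natural base for (mod 2) additive homomorphism can be shown in a toy example. This is not the proposed scheme itself (which adds message expansion and secret randomness); it only demonstrates the code's linearity: the XOR of two codewords is the codeword of the XOR of the two messages.

```python
# Toy illustration: linearity of the first-order Reed-Muller code RM(1, 3).
G = [  # generator matrix of RM(1, 3): 4-bit messages -> 8-bit codewords
    [1, 1, 1, 1, 1, 1, 1, 1],
    [0, 0, 0, 0, 1, 1, 1, 1],
    [0, 0, 1, 1, 0, 0, 1, 1],
    [0, 1, 0, 1, 0, 1, 0, 1],
]

def rm_encode(message):
    """Encode a 4-bit message as the mod-2 product m . G."""
    return [sum(m * g for m, g in zip(message, col)) % 2
            for col in zip(*G)]

def xor(u, v):
    return [a ^ b for a, b in zip(u, v)]

a, b = [1, 0, 1, 1], [0, 1, 1, 0]
# XOR of codewords == codeword of XOR of messages: (mod 2) additivity.
assert xor(rm_encode(a), rm_encode(b)) == rm_encode(xor(a, b))
```

This linearity is exactly what makes unlimited (mod 2) additions possible; the security work in the abstract concerns hiding which codeword encodes which message.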
Path planning is among the most significant topics in robotics research, as it concerns finding a safe and efficient route for a wheeled mobile robot in a cluttered environment and is a key prerequisite for the success of any mobile robot project. This paper proposes optimal path planning with collision avoidance for a wheeled mobile robot using the grey wolf optimization (GWO) algorithm as a method for finding the shortest safe path. The goal of this study is to identify the best path while taking into account the effect of the number of obstacles and of the design parameters on the algorithm's performance. Simulations are run in the MATLAB environment to test the
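The GWO update rule can be sketched compactly. The objective here is a toy sphere function standing in for the paper's path cost (which would penalize path length and obstacle collisions); wolf count, iteration count, and bounds are illustrative choices.

```python
# Compact grey wolf optimization (GWO) sketch: the three best wolves
# (alpha, beta, delta) guide the rest of the pack toward the optimum.
import random

def gwo(objective, dim=2, wolves=20, iters=200, lo=-10.0, hi=10.0, seed=1):
    random.seed(seed)
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(wolves)]
    for t in range(iters):
        X.sort(key=objective)                 # rank the pack by fitness
        alpha, beta, delta = X[0], X[1], X[2]
        a = 2 - 2 * t / iters                 # control parameter decays 2 -> 0
        for i in range(wolves):
            new = []
            for d in range(dim):
                guided = []
                for leader in (alpha, beta, delta):
                    r1, r2 = random.random(), random.random()
                    A, C = 2 * a * r1 - a, 2 * r2
                    D = abs(C * leader[d] - X[i][d])
                    guided.append(leader[d] - A * D)
                new.append(min(hi, max(lo, sum(guided) / 3)))
            X[i] = new
    return min(X, key=objective)

best = gwo(lambda p: sum(x * x for x in p))   # minimize the sphere function
```

For path planning, each wolf's position vector would encode a candidate path (e.g., a list of waypoints) and `objective` would return its cost.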
Fluoroscopic images are a class of medical images whose quality determines correct diagnosis. The main challenge is de-noising: keeping the balance between degrading the noisy image, on one side, and preserving edges and fine details, on the other, especially when fluoroscopic images contain high-density black-and-white noise. Previous filters can usually handle low to medium black-and-white noise densities, at the expense of edge and fine-detail preservation, and fail at the high noise densities that corrupt these images. This paper therefore proposes a new Multi-Line algorithm that handles images highly corrupted by dense black-and-white noise. The experiments achieved i
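As a point of reference, the standard remedy for black-and-white (salt-and-pepper) noise is the median filter, which is exactly the kind of baseline that fails at high noise densities. The sketch below is that baseline, not the paper's Multi-Line algorithm.

```python
# Baseline for comparison only: a plain 3x3 median filter, the classic remedy
# for salt-and-pepper noise; it works at low/medium densities but fails when
# most pixels in a window are themselves corrupted.
def median_filter_3x3(img):
    """img: 2-D list of grayscale values; border pixels are kept unfiltered."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]          # median of the 9 neighbours
    return out

noisy = [
    [10, 10, 10, 10],
    [10, 255, 10, 10],   # a single "salt" pixel
    [10, 10, 0, 10],     # a single "pepper" pixel
    [10, 10, 10, 10],
]
clean = median_filter_3x3(noisy)
```

At high densities the median of a window is itself likely to be a noise value, which motivates algorithms that first detect corrupted pixels and estimate replacements from uncorrupted neighbours.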
Pure and Bi-doped TiO2 films are obtained by the pulsed laser deposition technique at room temperature under a vacuum of 10^-3 mbar, and the influence of Bi content on the photovoltaic properties of TiO2 heterojunctions is studied. All the films display photovoltaic behavior in the near-visible region. A broad double peak is observed around λ = 300 nm for pure TiO2 at RT in the spectral response of the photocurrent, corresponding approximately to the absorption edge; this peak shifts to a higher wavelength (600 nm) when the Bi content increases to 7%, then decreases at 9%. The result is confirmed by the decrease of the energy gap in the optical properties. The increase is due to the larger amount of Bi content, and the peak shifts to 400 nm when annealed at 523
Image contrast enhancement methods have been a topic of interest in digital image processing for applications such as satellite imaging, recognition, medical imaging, and stereo vision. This paper studies image enhancement using Adaptive Histogram Equalization and Weighted Gamma Correction to handle the radiometric conditions and illumination variations of stereo image pairs. In the proposed method, the stereo pair images are segmented, with a weighted distribution, into sub-histograms supported by Histogram Equalization (HE) mapping or gamma correction and guided filtering. The experimental results show that the tested techniques outperform the original image in ev
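The two building blocks named above, histogram equalization and gamma correction, can be sketched for an 8-bit grayscale image. The paper combines weighted, adaptive variants of both with guided filtering; this is only the plain global form.

```python
# Plain global versions of the two operations: HE maps intensities through the
# normalized cumulative histogram; gamma correction applies a power-law curve.
def equalize(pixels, levels=256):
    """Histogram equalization over a flat list of pixel values."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, total = [], 0
    for count in hist:
        total += count
        cdf.append(total)
    scale = (levels - 1) / len(pixels)
    lut = [round(c * scale) for c in cdf]   # lookup table from the CDF
    return [lut[p] for p in pixels]

def gamma_correct(pixels, gamma, levels=256):
    """Power-law mapping: out = max * (in / max) ** gamma."""
    top = levels - 1
    return [round(top * (p / top) ** gamma) for p in pixels]

dark = [10, 20, 30, 40, 50, 60, 70, 80]    # a low-contrast, dark "image"
stretched = equalize(dark)                 # spreads values across [0, 255]
brightened = gamma_correct(dark, gamma=0.5)  # gamma < 1 brightens
```

Adaptive variants apply the same mappings per region (or per sub-histogram, as here), which is what caters for spatially varying illumination between the two stereo views.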
In this paper, we employ a combination of three techniques to reduce the computational complexity and bit rate of compressed images: bit-plane coding based on two absolute values, vector quantization (VQ) using a cache codebook, and Weber's law condition. The experimental results show that the proposed techniques reduce the storage size of the bit planes with low computational complexity.
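The common starting point of such coders, plain bit-plane decomposition, can be sketched as follows; the two-absolute-value representation, cache-codebook VQ, and Weber's law condition from the abstract would then operate on these planes.

```python
# Bit-plane decomposition of an 8-bit image (flat list of pixels): plane k
# holds bit k of every pixel, and summing the weighted planes reconstructs
# the image exactly.
def to_bit_planes(pixels):
    """Return 8 planes, planes[k][i] = bit k of pixels[i] (k=0 is the LSB)."""
    return [[(p >> k) & 1 for p in pixels] for k in range(8)]

def from_bit_planes(planes):
    """Inverse: weight plane k by 2**k and sum per pixel."""
    return [sum(bit << k for k, bit in enumerate(bits))
            for bits in zip(*planes)]

image = [0, 17, 128, 255, 73]
planes = to_bit_planes(image)
assert from_bit_planes(planes) == image      # lossless round trip
```

High-order planes carry most of the visual information, which is why coding each plane with a compact representation (such as two absolute values) can cut the bit rate cheaply.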
Image registration plays a significant role in the medical image processing field. This paper proposes an improvement to the accuracy and performance of the Speeded-Up Robust Features (SURF) algorithm for creating Extended Field of View (EFoV) ultrasound (US) images by applying different matching measures: Euclidean distance, cityblock distance, variation, and correlation in the matching stage built into the SURF algorithm. US image registration (fusion) was implemented using the control points obtained from these matching measures. A matched-points-with-higher-frequency algorithm was proposed in this work to perform and enhance EFoV US imaging, since the maximum accurate matching po
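The matching stage described above can be sketched with the two simplest measures named, Euclidean and cityblock (L1) distance. Real SURF descriptors are 64-dimensional; 3-D toy vectors are used here for brevity.

```python
# Nearest-neighbour descriptor matching under interchangeable distance
# measures, as in the SURF matching stage the paper modifies.
def euclidean(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

def cityblock(u, v):
    return sum(abs(a - b) for a, b in zip(u, v))

def match(desc_a, desc_b, distance):
    """For each descriptor in image A, index of its nearest match in image B."""
    return [min(range(len(desc_b)), key=lambda j: distance(d, desc_b[j]))
            for d in desc_a]

img_a = [[0.1, 0.9, 0.2], [0.8, 0.1, 0.7]]   # toy descriptors, image A
img_b = [[0.7, 0.2, 0.6], [0.0, 1.0, 0.1]]   # toy descriptors, image B
pairs = match(img_a, img_b, euclidean)        # control-point candidates
```

Swapping `euclidean` for `cityblock`, variation, or correlation changes which control points survive, which is exactly the comparison the abstract describes.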