Intrusion detection systems detect attacks inside computers and networks, where attacks must be detected quickly and at a high rate. Various proposed methods have achieved high detection rates, either by improving an algorithm or by hybridizing it with another algorithm; however, they suffer from long processing times, especially after such improvements and when handling large volumes of traffic data. Conversely, DNA-sequence detection approaches have previously been applied to intrusion detection systems: their processing times were fast, but the achieved detection rates were very low. In addition, feature selection is used to reduce computation and complexity and thereby speed up the system. Here, a new feature selection method is proposed based on DNA encoding and on the positions of DNA keys. The system has three phases: the first, the pre-processing phase, extracts the keys and their positions; the second, the training phase, selects features based on the key positions obtained from pre-processing; and the third, the testing phase, classifies network traffic records as either normal or attack using the selected features. Performance is measured by detection rate, false alarm rate, accuracy, and time, where time includes both encoding time and matching time. All results are based on using two or three keys and are evaluated on two datasets, KDD Cup 99 and NSL-KDD. The detection rate, false alarm rate, accuracy, encoding time, and matching time achieved for all corrected KDD Cup records (311,029 records) using two and three keys are 96.97%, 33.67%, 91%, 325 s, and 13 s, and 92.74%, 7.41%, 92.71%, 325 s, and 20 s, respectively.
The corresponding results for all NSL-KDD records (22,544 records) using two and three keys are 89.34%, 28.94%, 81.46%, 20 s, and 1 s, and 82.93%, 11.40%, 85.37%, 20 s, and 1 s, respectively. The proposed system is evaluated and compared with previous systems in terms of encoding time and matching time; the outcomes show that the present system detects attacks faster than the previous ones.
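The three phases described above (extracting key positions, selecting features by position, and matching at test time) can be sketched as follows. This is only an illustrative interpretation: the encoding rule, the example keys, and the match-count classifier are assumptions, not the paper's exact method.

```python
# Hypothetical sketch of DNA-encoding feature selection for an IDS:
# 1. encode a numeric record into a DNA-like string,
# 2. pre-processing: record the positions where short "keys" occur,
# 3. testing: count how many stored key positions reappear in a new record.

DNA = "ACGT"

def encode(record):
    """Map each feature value to two DNA symbols (base-4 digits of value % 16)."""
    out = []
    for v in record:
        v = int(v) % 16
        out.append(DNA[v // 4])
        out.append(DNA[v % 4])
    return "".join(out)

def key_positions(sequence, keys):
    """Pre-processing phase: find every position of each key in a reference sequence."""
    pos = {}
    for k in keys:
        p, found = sequence.find(k), []
        while p != -1:
            found.append(p)
            p = sequence.find(k, p + 1)
        pos[k] = found
    return pos

def matches(sequence, pos):
    """Testing phase: count stored key positions that reappear in a new sequence."""
    return sum(sequence[p:p + len(k)] == k for k, ps in pos.items() for p in ps)
```

A record would then be labelled normal when its match count against the normal-traffic key positions exceeds some threshold, with the key positions acting as the selected features.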
In this paper, a new modification is proposed to enhance the security level of the Blowfish algorithm by increasing the difficulty of cracking the original message, making it safer against unauthorized attacks. Blowfish is a symmetric, variable-length-key, 64-bit block cipher; here it is implemented using grayscale images of different sizes. Instead of using a single key in the cipher operation, the proposed algorithm adds a second key (KEY2) of one byte, applied in the Feistel function of the first round of both the encryption and decryption processes. In addition, the proposed modified Blowfish algorithm uses five S-boxes instead of four; the additional key (KEY2) is selected randomly from the additional S-box.
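For context, Blowfish's standard F-function splits a 32-bit half-block into four bytes that index four S-boxes. The abstract does not specify exactly how the fifth S-box and KEY2 are combined in the first round, so the modified variant below is one assumed possibility, not the paper's definition.

```python
# Illustrative sketch only: the standard Blowfish F-function, plus one assumed
# way a fifth S-box and a one-byte KEY2 could enter the first round (the paper
# does not specify the exact combination; this is an assumption).
MASK32 = 0xFFFFFFFF

def blowfish_f(x, S):
    """Standard Blowfish F: split 32-bit x into 4 bytes indexing S-boxes S[0..3]."""
    a, b, c, d = (x >> 24) & 0xFF, (x >> 16) & 0xFF, (x >> 8) & 0xFF, x & 0xFF
    return ((((S[0][a] + S[1][b]) & MASK32) ^ S[2][c]) + S[3][d]) & MASK32

def modified_f_round1(x, S, key2):
    """Assumed modification: XOR in a value from a fifth S-box selected by KEY2."""
    return blowfish_f(x, S) ^ S[4][key2 & 0xFF]
```

Because decryption runs the same rounds with the subkeys reversed, applying the extra term only in a fixed round keeps the cipher invertible while the attacker must now also recover KEY2.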
The advancements in Information and Communication Technology (ICT) over the previous decades have significantly changed how people transmit and store their information over the Internet and networks, so one of the main challenges is keeping this information safe against attacks. Many researchers and institutions have recognized the importance and benefits of cryptography in achieving efficient and effective secure communication. This work adopts a novel technique for a secure data cryptosystem based on chaos theory. The proposed algorithm generates a 2-dimensional key matrix with the same dimensions as the original image, containing random numbers obtained from the 1-dimensional logistic chaotic map for given con…
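The key-matrix generation step can be sketched from the logistic map recurrence x_{n+1} = r·x_n·(1 − x_n). The abstract does not give the control parameter, initial condition, or byte mapping, so the values below are assumptions for illustration only.

```python
# A minimal sketch of generating a 2-D key matrix from the 1-D logistic map
# x_{n+1} = r * x_n * (1 - x_n); the parameters x0 and r and the byte mapping
# are assumed, not taken from the paper.
def logistic_key_matrix(rows, cols, x0=0.3141, r=3.99):
    """Iterate the logistic map and fold values into a rows x cols byte matrix."""
    x, matrix = x0, []
    for _ in range(rows):
        row = []
        for _ in range(cols):
            x = r * x * (1 - x)              # chaotic iteration in (0, 1)
            row.append(int(x * 256) % 256)   # map the state to one key byte
        matrix.append(row)
    return matrix
```

A typical use is to XOR each image pixel with the key byte at the same position; since the map is deterministic, the receiver regenerates the identical matrix from the shared (x0, r) secret.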
This paper concerns estimating the unknown parameters of the generalized Rayleigh distribution model based on singly type-I censored samples. The probability density function of the generalized Rayleigh distribution is defined together with its properties. The maximum likelihood method is used to derive point estimates of all unknown parameters via an iterative procedure, the Newton–Raphson method, and confidence interval estimates are then derived based on the Fisher information matrix. Finally, whether the current model (GRD) fits a set of real data is tested, and the survival function and hazard function are computed for these real data.
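The Newton–Raphson step mentioned above can be sketched generically: the MLE solves score(θ) = 0, where the score is the derivative of the log-likelihood. The GRD log-likelihood itself is not reproduced in the abstract, so the score and its derivative are left as user-supplied placeholders here.

```python
# Generic Newton-Raphson iteration for a one-parameter MLE, sketching the
# numerical step the abstract mentions; score() and score_prime() stand in for
# the GRD log-likelihood derivatives, which are not given in the abstract.
def newton_raphson(score, score_prime, theta0, tol=1e-8, max_iter=100):
    """Solve score(theta) = 0, i.e. find a stationary point of the log-likelihood."""
    theta = theta0
    for _ in range(max_iter):
        step = score(theta) / score_prime(theta)
        theta -= step
        if abs(step) < tol:
            break
    return theta
```

As a sanity check, solving score(t) = t² − 2 with derivative 2t from t = 1 converges to √2, illustrating the quadratic convergence the method provides near the root.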
The growing interest in the use of chaotic techniques for enabling secure communication in recent years has been motivated by the emergence of a number of wireless services that require the service provider to deliver low bit error rates (BER) along with information security. This paper investigates the feasibility of using chaotic communications over Multiple-Input Multiple-Output (MIMO) channels. While the use of chaotic maps can enhance security, the overall BER performance degrades compared to conventional communication schemes. To overcome this limitation, we have proposed combining chaotic modulation with the Alamouti space-time block code. The performance of Chaos Shift Keying (CSK) wi…
There has been growing interest in the use of chaotic techniques for enabling secure communication in recent years. This need has been motivated by the emergence of a number of wireless services that require the channel to provide very low bit error rates (BER) along with information security. This paper investigates the feasibility of chaotic communications over Multiple-Input Multiple-Output (MIMO) channels by combining chaos modulation with a suitable Space-Time Block Code (STBC). It is well known that chaotic modulation techniques can enhance communication security; however, the BER performance of systems using chaos modulation has been observed to be inferior to that of conventional communication schemes.
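The Alamouti STBC that the two abstracts above combine with chaotic modulation transmits two symbols over two antennas and two time slots, and a simple linear combiner at the receiver recovers them with full diversity. A compact sketch (assuming a flat-fading channel and perfect channel knowledge):

```python
# Alamouti space-time block coding over two transmit antennas, sketched for a
# single receive antenna with flat fading and perfect channel state information.
def alamouti_encode(s1, s2):
    """Slot 1 sends (s1, s2) on antennas 1 and 2; slot 2 sends (-s2*, s1*)."""
    return [(s1, s2), (-s2.conjugate(), s1.conjugate())]

def alamouti_combine(r1, r2, h1, h2):
    """Linear combining of the two received slots; yields scaled symbol estimates."""
    s1_hat = h1.conjugate() * r1 + h2 * r2.conjugate()
    s2_hat = h2.conjugate() * r1 - h1 * r2.conjugate()
    return s1_hat, s2_hat
```

Working through the algebra, each estimate comes out as (|h1|² + |h2|²) times the transmitted symbol, which is why the scheme restores the BER performance that chaotic modulation alone loses.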
Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording them. This issue degrades the performance of machine learning models because the values of some features are missing, so a dedicated class of methods is needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used to generate and impute the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, namely support vector machine (SVM), K-nearest neighbour (KNN), and Naïve Bayes, …
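The core idea, treating each missing value as a decision variable and searching for the values that make the record most plausible, can be illustrated with a toy stand-in. The real SSA update rules are replaced here by plain random search, and the fitness function (distance to the nearest complete record) is an assumption; neither is taken from the paper.

```python
# Toy sketch of imputation-as-optimization: the missing entries are search
# variables, and the (assumed) fitness is the squared distance to the nearest
# complete record. Random search stands in for the salp swarm update rules.
import random

def impute(record, complete_rows, missing_idx, iters=200, seed=0):
    rng = random.Random(seed)
    lo = [min(r[i] for r in complete_rows) for i in missing_idx]
    hi = [max(r[i] for r in complete_rows) for i in missing_idx]

    def fitness(vals):
        cand = list(record)
        for i, v in zip(missing_idx, vals):
            cand[i] = v
        return min(sum((a - b) ** 2 for a, b in zip(cand, r)) for r in complete_rows)

    best = [rng.uniform(l, h) for l, h in zip(lo, hi)]
    for _ in range(iters):
        trial = [rng.uniform(l, h) for l, h in zip(lo, hi)]
        if fitness(trial) < fitness(best):
            best = trial
    for i, v in zip(missing_idx, best):
        record[i] = v
    return record
```

The imputed dataset would then be fed to the downstream classifiers (SVM, KNN, Naïve Bayes) to measure the effect on classification performance.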
Graphite coated electrodes (GCE) based on molecularly imprinted polymers were fabricated for the selective potentiometric determination of risperidone (Ris). The molecularly imprinted (MIP) and non-imprinted (NIP) polymers were synthesized by bulk polymerization using Ris as the template, acrylic acid (AA) and acrylamide (AAm) as monomers, ethylene glycol dimethacrylate (EGDMA) as the cross-linker, and benzoyl peroxide (BPO) as the initiator. The imprinted and non-imprinted membranes were prepared using dioctyl phthalate (DOP) and dibutyl phthalate (DBP) as plasticizers in a PVC matrix, and the membranes were coated on graphite electrodes. The MIP electrodes using…
The choice of binary pseudonoise (PN) sequences with specific properties, namely long period, high complexity, randomness, and minimal cross- and auto-correlation, is essential for some communication systems. In this research a nonlinear PN generator is introduced. It consists of a combination of basic components such as a Linear Feedback Shift Register (LFSR) and a ?-element, which is a type of R×R crossbar switch. The period and complexity of the sequences generated by the proposed generator are computed, and the randomness properties of these sequences are measured by well-known randomness tests.
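The LFSR building block mentioned above can be sketched as follows. The tap positions chosen here are one maximal-length example for a 4-bit register, not the configuration used in the paper, and the nonlinear ?-element is omitted since its definition is not given.

```python
# A minimal Fibonacci LFSR: each step outputs the LSB and shifts in the XOR of
# the tapped bits. Tap choice (bits 0 and 3 of a 4-bit register) is a
# maximal-length example, giving period 2^4 - 1 = 15 for any nonzero seed.
def lfsr(seed, taps, nbits, length):
    """Generate `length` output bits from an nbits-wide LFSR with the given taps."""
    state, out = seed, []
    for _ in range(length):
        out.append(state & 1)                       # output the LSB
        fb = 0
        for t in taps:                              # XOR of tapped bits
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << (nbits - 1))  # shift right, feedback to MSB
    return out
```

Because an LFSR alone is linear (and thus predictable from 2n output bits), generators like the one proposed pass its output through a nonlinear stage such as the crossbar-switch element to raise the sequence complexity.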
Social media and networks rely heavily on images, and those images should be distributed in a private manner; image encryption is therefore one of the most crucial components of cyber security. In the present study, an effective image encryption technique is developed that combines the Rabbit algorithm, a simple lightweight cipher, with the Aizawa attractor, a chaotic map. The Aizawa attractor, a 3D dynamical system, makes the lightweight Rabbit encryption algorithm more secure. The process separates color images into blocks by first dividing them into red, green, and blue (RGB) bands. The presented approach generates multiple keys, or sequences, based on the initial parameters and conditions, which are…
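The chaotic component can be sketched by integrating the Aizawa system and folding the trajectory into key bytes. The parameter set below is the commonly cited one for this attractor, and the Euler step size and byte-extraction rule are assumptions, since the paper's mapping is not shown in the abstract.

```python
# Sketch: derive a keystream from the Aizawa attractor (a 3D chaotic system)
# via forward-Euler integration. Parameters are the commonly used set; the
# step size and the trajectory-to-byte mapping are assumptions.
def aizawa_keystream(n, x=0.1, y=0.0, z=0.0,
                     a=0.95, b=0.7, c=0.6, d=3.5, e=0.25, f=0.1, dt=0.01):
    out = []
    for _ in range(n):
        dx = (z - b) * x - d * y
        dy = d * x + (z - b) * y
        dz = c + a * z - z ** 3 / 3 - (x * x + y * y) * (1 + e * z) + f * z * x ** 3
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        out.append(int(abs(x) * 1e6) % 256)   # fold a trajectory value into a byte
    return out
```

Each RGB block would then be mixed with such a sequence (for instance by XOR), with the initial conditions and parameters serving as the shared secret; sensitivity to those initial conditions is what makes the keystream hard to reproduce without them.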