A novel method for a Network Intrusion Detection System (NIDS) is proposed, inspired by the way DNA sequences are used to detect disease, since both domains share a similar conceptual model of detection. The method applies DNA sequences to NIDS in three steps: convert the network traffic data into a DNA-sequence form using a cryptography encoding method; discover Short Tandem Repeat (STR) patterns for each class of network attack using the Teiresias algorithm; and classify traffic according to its STR sequence using the Horspool algorithm. The 10% KDD Cup 1999 data set is used for the training phase, and the corrected KDD Cup 1999 data set is used for the testing phase to evaluate the proposed method. The experimental results show that the proposed system achieves a detection rate of 86.36%, a false alarm rate of 49.69%, and an accuracy of 77.65%. These results compare favorably with previous baseline algorithms. It can be concluded that the DNA-sequence approach is a promising direction for NIDS and could be improved further with a better encoding method.
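The classification step relies on fast exact string matching. As a minimal sketch of that step, the Boyer-Moore-Horspool search below looks for a hypothetical STR attack signature inside a DNA-encoded traffic record; the encoding scheme and the signature itself are illustrative, not taken from the paper.

```python
def horspool_search(text: str, pattern: str) -> int:
    """Boyer-Moore-Horspool: return the index of the first occurrence
    of `pattern` in `text`, or -1 if it is absent."""
    m, n = len(pattern), len(text)
    if m == 0 or m > n:
        return 0 if m == 0 else -1
    # Bad-character shift table: default jump is the full pattern length.
    shift = {c: m for c in set(text)}
    for i, c in enumerate(pattern[:-1]):
        shift[c] = m - 1 - i
    pos = 0
    while pos <= n - m:
        if text[pos:pos + m] == pattern:
            return pos
        # Shift by the table entry of the character under the window's end.
        pos += shift.get(text[pos + m - 1], m)
    return -1

# Hypothetical DNA-encoded traffic record and STR signature for one attack class.
encoded_traffic = "ACGTACGTTTAGGCTTAGGCTTAGGCACGT"
str_signature = "TTAGGCTTAGGC"
print(horspool_search(encoded_traffic, str_signature))  # -> 8 (match found)
```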
Because databases are exposed to threats and attacks during transmission from sender to receiver, which is one of the foremost security concerns of network users, a lightweight cryptosystem using the Rivest Cipher 4 (RC4) algorithm is proposed. The cryptosystem maintains data privacy by encrypting the data into cipher form, transferring it over the network, and then decrypting it back to the original data. Hence, the ciphers act as an encapsulating layer for the database tables.
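RC4 is a symmetric stream cipher, so the same routine both encrypts and decrypts. Below is a minimal, self-contained sketch; the key and the database record are illustrative placeholders, since the paper's key management and record format are not described here.

```python
def rc4(key: bytes, data: bytes) -> bytes:
    """RC4 stream cipher; the same call both encrypts and decrypts."""
    # Key-scheduling algorithm (KSA): permute the state with the key.
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA): XOR keystream with data.
    out = bytearray()
    i = j = 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

record = b"id=42;name=Alice"                    # hypothetical database row
cipher = rc4(b"shared-secret", record)          # sent over the network in cipher form
assert rc4(b"shared-secret", cipher) == record  # receiver recovers the original
```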
Elliptic Curve Cryptography (ECC) is a public key cryptosystem that operates on algebraic structures in the form of elliptic curves. In ECC, encryption typically requires encoding the data onto the elliptic curve as a preprocessing step. Similarly, after decryption, a postprocessing step must map (decode) the curve point back to the corresponding data. Memory Mapping (MM) and Koblitz Encoding (KE) are the commonly used encoding models, but both have drawbacks: MM needs more memory for processing, and KE needs more computational resources. To overcome these issues, an enhanced Koblitz encoding model is proposed.
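For context, a minimal sketch of the classic (unenhanced) Koblitz encoding is shown below on a small textbook curve; the curve parameters and the auxiliary parameter K are illustrative, and the paper's enhancements are not reproduced.

```python
# Toy curve y^2 = x^3 + a*x + b (mod p); parameters are illustrative only.
p, a, b = 751, -1, 188
K = 20  # Koblitz auxiliary parameter; requires m*K + K <= p for lossless decoding

def encode(m: int):
    """Map the message integer m to a curve point by probing x = m*K + j."""
    for j in range(K):
        x = m * K + j
        rhs = (x**3 + a * x + b) % p
        # Since p % 4 == 3, a square root of a residue is rhs^((p+1)/4) mod p.
        y = pow(rhs, (p + 1) // 4, p)
        if (y * y) % p == rhs:          # rhs was a quadratic residue: point found
            return x, y
    raise ValueError("no curve point found; enlarge K")

def decode(point) -> int:
    """Postprocessing step: recover the message as x // K."""
    return point[0] // K

P = encode(9)
assert decode(P) == 9
```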
In this article, a numerical method integrated with a statistical data simulation technique is introduced to solve a nonlinear system of ordinary differential equations with multiple random variable coefficients. Monte Carlo simulation combined with the central divided-difference formula of the finite difference (FD) method is repeated n times to simulate values of the variable coefficients as random samples, instead of limiting them to fixed real values over time. The mean of the n final solutions from this integrated technique, named in short the mean Monte Carlo finite difference (MMCFD) method, represents the final solution of the system. This method is proposed for the first time to calculate the numerical solution obtained for this kind of system.
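As a minimal sketch of the idea (not the paper's particular system), the code below integrates a single nonlinear ODE y' = a*y*(1 - y) with a central-difference (leapfrog) scheme, redraws the random coefficient a on every Monte Carlo run, and averages the n trajectories; the distribution of a, the equation, and the step counts are all assumptions.

```python
import random

def leapfrog_solve(f, y0, t0, t1, steps):
    """Central divided-difference (leapfrog) scheme for y' = f(t, y):
    y[i+1] = y[i-1] + 2h*f(t_i, y_i), bootstrapped with one Euler step."""
    h = (t1 - t0) / steps
    ys = [y0, y0 + h * f(t0, y0)]
    for i in range(1, steps):
        ys.append(ys[i - 1] + 2 * h * f(t0 + i * h, ys[i]))
    return ys

def mmcfd(n=1000, steps=100):
    """Repeat the FD solve n times with a freshly sampled random
    coefficient, then average the trajectories pointwise."""
    runs = []
    for _ in range(n):
        coeff = random.gauss(1.5, 0.2)               # random coefficient, assumed N(1.5, 0.2)
        f = lambda t, y, a=coeff: a * y * (1.0 - y)  # illustrative nonlinear ODE
        runs.append(leapfrog_solve(f, 0.1, 0.0, 2.0, steps))
    return [sum(col) / n for col in zip(*runs)]      # MMCFD mean solution

mean_path = mmcfd()
print(mean_path[-1])  # mean solution at the final time t = 2
```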
Merging biometrics with cryptography has become more familiar, and a rich scientific field has emerged for researchers. Biometrics adds a distinctive property to security systems, because biometric features are unique to each individual. In this study, a new method is presented for ciphering data based on fingerprint features. The plaintext message is placed, at the positions of minutiae extracted from a fingerprint, into a generated random text file, regardless of the size of the data. The proposed method can be explained in three scenarios: in the first scenario, the message is inserted directly into the random text at the minutiae positions; in the second scenario, the message is encrypted with a chosen word before ciphering …
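A minimal sketch of the first scenario follows: message characters are planted in a random text buffer at offsets derived from (x, y) minutiae coordinates. The minutiae list, the coordinate-to-offset mapping, and the buffer size are illustrative assumptions; real minutiae would come from a fingerprint feature extractor.

```python
import random
import string

def minutiae_to_offsets(minutiae, width, buf_len):
    """Map (x, y) minutiae coordinates to distinct offsets in the text buffer."""
    offsets = []
    for x, y in minutiae:
        off = (y * width + x) % buf_len
        while off in offsets:            # resolve collisions by linear probing
            off = (off + 1) % buf_len
        offsets.append(off)
    return offsets

def hide(message, minutiae, buf_len=256, width=300, seed=7):
    """Scenario 1: plant message characters directly in random text."""
    rng = random.Random(seed)
    buf = [rng.choice(string.ascii_letters) for _ in range(buf_len)]
    for ch, off in zip(message, minutiae_to_offsets(minutiae, width, buf_len)):
        buf[off] = ch
    return "".join(buf)

def reveal(buf, minutiae, width=300):
    offsets = minutiae_to_offsets(minutiae, width, len(buf))
    return "".join(buf[off] for off in offsets)

minutiae = [(12, 40), (55, 13), (73, 88), (20, 5), (91, 66)]  # hypothetical extractor output
stego_text = hide("HELLO", minutiae)
assert reveal(stego_text, minutiae)[:5] == "HELLO"
```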
Most existing digital watermarking techniques still embed the watermark directly in pixel values, without taking into account the level of compression (attack) the image may undergo, which makes the watermark easy to destroy. In this research, a method is proposed to overcome this problem. It is based on the DCT (after partitioning the image into non-overlapping 8×8-pixel blocks), accompanied by a quantization method. The watermark (a digital image) is embedded in the DCT frequency domain by seeking the blocks with the highest standard deviation (computed over the AC coefficients only) within a predetermined threshold value; the covered image is then compressed (attacked) with varying degrees of compression. The suggested method …
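A minimal sketch of such an embedding step is shown below, assuming a quantization-style rule on one mid-frequency AC coefficient. The coefficient position, quantization step q, and block selection by AC standard deviation are illustrative choices; the extraction step and the compression-attack evaluation are omitted.

```python
import numpy as np
from scipy.fft import dctn, idctn

def embed_bit(block, bit, q=12.0, coef=(4, 3)):
    """Quantize one mid-frequency AC coefficient so it carries a watermark bit."""
    d = dctn(block, norm="ortho")
    d[coef] = np.floor(d[coef] / q) * q + (0.75 * q if bit else 0.25 * q)
    return idctn(d, norm="ortho")

def embed_watermark(image, bits, q=12.0):
    """Embed bits into the 8x8 blocks whose AC coefficients have the
    highest standard deviation (the most textured blocks)."""
    img = image.astype(float).copy()
    h, w = img.shape
    blocks = [(r, c) for r in range(0, h - 7, 8) for c in range(0, w - 7, 8)]

    def ac_std(rc):
        d = dctn(img[rc[0]:rc[0] + 8, rc[1]:rc[1] + 8], norm="ortho")
        return np.std(d.flatten()[1:])   # standard deviation over AC coefficients only

    blocks.sort(key=ac_std, reverse=True)
    for bit, (r, c) in zip(bits, blocks):
        img[r:r + 8, c:c + 8] = embed_bit(img[r:r + 8, c:c + 8], bit, q)
    return img

cover = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(float)  # stand-in cover
marked = embed_watermark(cover, [1, 0, 1, 1, 0])
```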
In this paper, a steganography algorithm is proposed that uses the DCT for the cover image and the DWT for the hidden image, together with an embedding-order key. For added security and complexity, the cover image is converted from RGB to YIQ; the Y plane is used, divided into four equal parts, and each part is transformed to the DCT domain. The four DWT coefficient subbands of the hidden image are embedded into the four parts of the cover DCT, with the embedding order determined by the order key, which is stored with the cover in a database table at both the sender and receiver sides. Experimental results show that the proposed algorithm successfully hides information in the cover image. Microsoft Office Access 2003 is used as the DBMS; the hiding and extracting algorithms …
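The sketch below illustrates the embedding idea under stated assumptions: pywt computes one Haar DWT level of the hidden image, the four subbands are added (scaled by a hypothetical strength alpha) into the high-frequency corner of the DCT of the four Y-plane quadrants, and a sample order key permutes which subband goes to which quadrant. The paper's exact embedding rule, key format, and extraction procedure are not reproduced.

```python
import numpy as np
import pywt
from scipy.fft import dctn, idctn

def embed(y_plane, hidden, order_key=(2, 0, 3, 1), alpha=0.01):
    """Hide the four DWT subbands of `hidden` in the DCTs of the four
    quadrants of the cover's Y plane, permuted by the order key."""
    LL, (LH, HL, HH) = pywt.dwt2(hidden.astype(float), "haar")
    subbands = [LL, LH, HL, HH]
    h, w = y_plane.shape
    quads = [(slice(0, h // 2), slice(0, w // 2)), (slice(0, h // 2), slice(w // 2, w)),
             (slice(h // 2, h), slice(0, w // 2)), (slice(h // 2, h), slice(w // 2, w))]
    out = y_plane.astype(float).copy()
    for band, q in zip(subbands, order_key):
        d = dctn(out[quads[q]], norm="ortho")
        sh, sw = band.shape
        d[-sh:, -sw:] += alpha * band    # additive embedding in the high-frequency corner
        out[quads[q]] = idctn(d, norm="ortho")
    return out

rng = np.random.default_rng(1)
y_plane = rng.integers(0, 256, (128, 128)).astype(float)  # stand-in Y plane of the cover
secret = rng.integers(0, 256, (64, 64)).astype(float)     # stand-in hidden image
stego_y = embed(y_plane, secret)
```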
Steganography (text-in-image hiding) methods are still an important issue for researchers at the present time. These methods vary in hiding style, from simple techniques to complex ones that resist potential attacks. The current research does not consider attacks on the host's secret text; instead, an improved, highly confidential method for hiding text within an image is proposed and implemented, combined with a strong password method, so as to ensure that no change is made to the pixel values of the host image after the text is hidden. The phrase "highly confidential" refers to the low level of suspicion raised by the covered image. Experimental results show that the covered image …
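Since the abstract specifies that pixel values remain unchanged, one plausible reading is a zero-distortion scheme in which a stego-key records where the message bytes already occur in the image, with the password seeding the selection. The sketch below is an assumption-laden illustration of that idea, not the paper's actual algorithm.

```python
import random
import numpy as np

def hide(image, message, password):
    """Record, for each message byte, the index of a pixel that already
    holds that value; the host image itself is never modified."""
    flat = image.flatten()
    rng = random.Random(password)        # the password seeds the index selection
    key = []
    for byte in message.encode():
        candidates = np.flatnonzero(flat == byte)
        if candidates.size == 0:
            raise ValueError(f"no pixel carries value {byte}")
        key.append(int(rng.choice(list(candidates))))
    return key                           # the stego-key, shared secretly

def reveal(image, key):
    flat = image.flatten()
    return bytes(int(flat[i]) for i in key).decode()

img = np.random.default_rng(3).integers(0, 256, (64, 64), dtype=np.uint8)
key = hide(img, "top secret", "pa55word")
assert reveal(img, key) == "top secret"
```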
Epithelial ovarian cancer is the leading cause of cancer death among gynecological malignancies. Angiogenesis is considered essential for tumor growth and the development of metastases. VEGF and IL-8 are potent angiostimulatory molecules, and their expression has been demonstrated in many solid tumors, including ovarian cancer. VEGF and IL-8 concentrations were measured by ELISA (Human VEGF and IL-8 Bioassay ELISA, US Biological, USA). The median VEGF and IL-8 levels were significantly higher in the sera of ovarian cancer patients than in patients with benign tumors and in healthy controls. Pretreatment serum VEGF and IL-8 levels might therefore be regarded as an additional tool in the differentiation of ovarian tumors.
Dust is a frequent contributor to health risks and climate change, and it is one of the most dangerous issues facing people today. The problem is driven by desertification, drought, agricultural practices, and sand and dust storms from neighboring regions. A deep learning (DL) regression based on long short-term memory (LSTM) is proposed to increase the accuracy of dust forecasting and monitoring. The proposed system has two parts: in the first, LSTM and dense layers are used to build the dust detection model; in the second, a Wireless Sensor Network (WSN) and Internet of Things (IoT) model is used for forecasting and monitoring. The experimental DL system …
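As a minimal sketch of the LSTM-plus-dense regression part, the Keras model below predicts the next dust reading from a sliding window of past readings; the window length, layer sizes, and the synthetic sensor series are illustrative assumptions rather than the paper's configuration.

```python
import numpy as np
import tensorflow as tf

WINDOW = 24  # hours of past readings per sample (assumed)

# Stand-in hourly dust-concentration series, as if read from a WSN node.
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 60, 2000)) + rng.normal(0, 0.1, 2000)
X = np.stack([series[i:i + WINDOW] for i in range(len(series) - WINDOW)])[..., None]
y = series[WINDOW:]                      # next-hour target for each window

# LSTM followed by dense layers for one-step-ahead dust regression.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, 1)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),            # predicted concentration at t + 1
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)
next_hour = model.predict(X[-1:], verbose=0)
```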
In this research, a technique is proposed to enhance the performance of the frame-difference method for extracting moving objects from video files. One of the main causes of performance degradation is the presence of noise, which may lead to incorrect identification of moving objects; it was therefore necessary to find a way to diminish this noise effect. Traditional average and median spatial filters can handle such situations, but the focus of this work is on the spectral domain, using Fourier and wavelet transforms to reduce the noise effect. Experiments and statistical features (entropy, standard deviation) showed that these transforms can overcome such problems in an elegant way.
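A minimal sketch of the Fourier variant follows: each frame is low-pass filtered in the FFT domain before differencing, and the absolute difference is thresholded into a motion mask. The kept-frequency fraction and the threshold are illustrative; the paper also evaluates wavelet transforms alongside the average and median filters.

```python
import numpy as np

def lowpass_fft(frame, keep=0.15):
    """Suppress high-frequency noise by keeping only a centered
    low-frequency square of the frame's 2-D spectrum."""
    F = np.fft.fftshift(np.fft.fft2(frame))
    h, w = frame.shape
    kh, kw = int(h * keep), int(w * keep)
    mask = np.zeros_like(F)
    mask[h // 2 - kh:h // 2 + kh, w // 2 - kw:w // 2 + kw] = 1
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

def moving_mask(prev, curr, thresh=25):
    """Frame difference on denoised frames, then binary thresholding."""
    diff = np.abs(lowpass_fft(curr.astype(float)) - lowpass_fft(prev.astype(float)))
    return diff > thresh

rng = np.random.default_rng(5)
prev = rng.integers(0, 50, (120, 160)).astype(float)   # noisy static background
curr = prev.copy()
curr[40:60, 70:100] += 120                             # simulated moving object
print(moving_mask(prev, curr).sum(), "changed pixels")
```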