Recent research has shown that Deoxyribonucleic Acid (DNA), whose function in the human body supports the discovery of diseases, can analogously be applied in an intrusion-detection system (IDS) to detect attacks against computer systems and network traffic. Three main factors influence the accuracy of a DNA-sequence-based IDS: the DNA encoding method, the STR keys, and the classification method used to evaluate the correctness of the proposed approach. The pioneering attempt to use a DNA sequence for intrusion detection relied on a normal signature sequence with an alignment threshold value; a later approach used DNA-encoding-based cryptography, but its detection rate was very low. Since each network traffic record consists of 41 attributes, we propose the shortest practical encoding of the same DNA length, a four-character DNA encoding that represents all 41 attributes, called DEM4all. The experiments were conducted using the standard KDDCup 99 and NSL-KDD datasets. The Teiresias algorithm is used to extract Short Tandem Repeats (STRs), including both the keys and their positions in the network traffic, while a brute-force algorithm is used in the classification process to determine whether the network traffic is an attack or normal. The experiment was run 30 times for each DNA encoding method. The results show that the proposed method achieves better accuracy (a 15% improvement) compared with previous and state-of-the-art DNA algorithms. With such results it can be concluded that the proposed DEM4all DNA encoding method is well suited for IDS. More complex encodings that further reduce the length of the DNA sequence may yield even higher detection accuracy.
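As an illustration of the encoding idea, the sketch below maps each numeric traffic attribute to a fixed-width string over the four nucleotides A, C, G, T and concatenates the per-attribute strings into one sequence. The mapping and the attribute width are hypothetical choices for illustration, not the paper's actual DEM4all table:

```python
# Hypothetical four-character DNA encoding sketch (in the spirit of DEM4all).
# Each non-negative integer attribute becomes a fixed-width base-4 string.

DNA = "ACGT"

def encode_value(value: int, width: int = 4) -> str:
    """Encode a non-negative integer as a fixed-width base-4 DNA string."""
    digits = []
    for _ in range(width):
        digits.append(DNA[value % 4])
        value //= 4
    return "".join(reversed(digits))

def encode_record(attributes) -> str:
    """Concatenate per-attribute encodings into one DNA sequence."""
    return "".join(encode_value(a) for a in attributes)

# e.g. a record with three attributes; a real KDDCup 99 record has 41
seq = encode_record([0, 5, 255])  # "AAAA" + "AACC" + "TTTT"
```

A real pipeline would first discretise continuous and categorical attributes into integers before encoding; that preprocessing step is omitted here.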
The possibility of implementing smart mobility in the traditional city: Studying the possibility of establishing an intelligent transportation system in the city center of Kadhimiya
To select the optimal method for tracking a fast time-varying multipath Rayleigh fading channel, this paper focuses on the recursive least-squares (RLS) and extended recursive least-squares (E-RLS) algorithms. Based on the simulation program's comparison of tracking performance and mean square error over five fast time-varying Rayleigh fading channels, repeated (send/receive) up to 100 times to verify the efficiency of the algorithms, it concludes that E-RLS is the more feasible of the two.
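A minimal sketch of the standard RLS recursion referred to above, applied here to a noise-free two-tap channel as a sanity check. The channel taps, forgetting factor, and data length are illustrative choices, not the paper's simulation settings, and the E-RLS variant is not shown:

```python
import numpy as np

def rls_filter(x, d, order=2, lam=0.99, delta=100.0):
    """Standard RLS adaptive filter: adapt weights w so that w @ x_n tracks d_n."""
    w = np.zeros(order)
    P = delta * np.eye(order)                # inverse correlation matrix estimate
    for n in range(order, len(x)):
        xn = x[n - order:n][::-1]            # most recent samples first
        e = d[n] - w @ xn                    # a-priori estimation error
        k = P @ xn / (lam + xn @ P @ xn)     # gain vector
        w = w + k * e
        P = (P - np.outer(k, xn @ P)) / lam  # update inverse correlation estimate
    return w

# Identify a fixed two-tap channel h = [0.5, -0.3] (noise-free illustration).
rng = np.random.default_rng(1)
x = rng.standard_normal(500)
d = np.zeros_like(x)
d[2:] = 0.5 * x[1:-1] - 0.3 * x[:-2]
w_hat = rls_filter(x, d)
```

Tracking a fading channel would use time-varying taps and compare the squared error trajectory over time, as the paper's simulations do.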
Abstract
This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the better method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
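The Downhill Simplex (Nelder-Mead) algorithm the abstract refers to can be sketched as below. For brevity it is demonstrated on a simple quadratic; in the setting of the paper the objective would instead be the negative log-likelihood of the four-parameter compound distribution:

```python
import numpy as np

def nelder_mead(f, x0, step=0.5, tol=1e-12, max_iter=2000):
    """Minimal Downhill Simplex (Nelder-Mead) minimiser: reflection,
    expansion, contraction, and shrink steps on an (n+1)-point simplex."""
    n = len(x0)
    simplex = [np.array(x0, float)]
    for i in range(n):                         # perturb each axis for the initial simplex
        p = np.array(x0, float)
        p[i] += step
        simplex.append(p)
    for _ in range(max_iter):
        simplex.sort(key=f)                    # best first, worst last
        best, second, worst = simplex[0], simplex[-2], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        centroid = np.mean(simplex[:-1], axis=0)
        refl = centroid + (centroid - worst)           # reflect worst through centroid
        if f(refl) < f(best):
            exp = centroid + 2.0 * (centroid - worst)  # try expanding further
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(second):
            simplex[-1] = refl
        else:
            contr = centroid + 0.5 * (worst - centroid)  # contract toward centroid
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:                               # shrink the whole simplex toward best
                simplex = [best + 0.5 * (p - best) for p in simplex]
    simplex.sort(key=f)
    return simplex[0]

# stand-in objective with known minimum at (3, -1)
x_opt = nelder_mead(lambda p: (p[0] - 3) ** 2 + (p[1] + 1) ** 2, [0.0, 0.0])
```

Because it uses only function values, not gradients, the method copes well with likelihoods that are awkward to differentiate, which is consistent with its strong showing in the comparison above.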
In this paper, an exact stiffness matrix and fixed-end load vector for nonprismatic beams having parabolically varying depth are derived. The principle of strain energy is used in the derivation of the stiffness matrix.
The effects of both shear deformation and the coupling between axial force and bending moment are considered in the derivation of the stiffness matrix. The fixed-end load vector for elements under uniformly distributed or concentrated loads is also derived. The correctness of the derived matrices is verified by numerical examples. It is found that the coupling effect between axial force and bending moment is significant for elements having axial end restraint. It was also found that the decrease in bending moment was in the
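For context, the sketch below builds the familiar 4x4 bending stiffness matrix of a prismatic Timoshenko beam element, which includes the shear-deformation effect through the dimensionless parameter phi (phi = 0 recovers the Euler-Bernoulli element). This is a textbook baseline for illustration, not the paper's exact matrix for parabolically varying depth:

```python
import numpy as np

def beam_stiffness(E, I, L, phi=0.0):
    """4x4 stiffness matrix (DOFs: w1, theta1, w2, theta2) for a prismatic
    Timoshenko beam element; phi = 12*E*I/(G*As*L**2) is the shear parameter."""
    c = E * I / ((1.0 + phi) * L ** 3)
    return c * np.array([
        [ 12.0,              6.0 * L,  -12.0,              6.0 * L],
        [ 6.0 * L, (4.0 + phi) * L**2, -6.0 * L, (2.0 - phi) * L**2],
        [-12.0,             -6.0 * L,   12.0,             -6.0 * L],
        [ 6.0 * L, (2.0 - phi) * L**2, -6.0 * L, (4.0 + phi) * L**2],
    ])

# sanity checks: symmetry and zero force under rigid-body motion
k = beam_stiffness(E=1.0, I=1.0, L=2.0, phi=0.1)
```

A valid stiffness matrix must be symmetric and produce zero nodal forces for rigid-body translation [1, 0, 1, 0] and rigid-body rotation [0, 1, L, 1]; the paper's numerical-example verification plays an analogous role for the nonprismatic element.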
Cloud computing is a mass platform serving high-volume data from multiple devices and numerous technologies. Cloud tenants demand fast access to their data without any disruptions; therefore, cloud providers struggle to ensure that every individual datum is secured and always accessible. Hence, an appropriate replication strategy capable of selecting essential data is required in cloud replication environments as the solution. This paper proposes a Crucial File Selection Strategy (CFSS) to address poor response time in a cloud replication environment. A cloud simulator called CloudSim is used to conduct the necessary experiments, and results are presented as evidence of the enhancement in replication performance. The obtained
The evolution of the Internet of Things (IoT) has connected billions of heterogeneous physical devices to improve the quality of human life by collecting data from their environments. However, there is a need to store the resulting huge data volumes with big storage and high computational capability, and cloud computing can be used to store this big data. The data of IoT devices is transferred using two types of protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a high-performance and more reliable system through efficient use of resources. Thus, load balancing in cloud computing is used to dynamically distribute the workload across nodes to avoid overloading any individual r
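As a minimal sketch of the load-distribution idea, the snippet below rotates incoming messages across a set of hypothetical nodes. A dynamic strategy like the one the abstract describes would instead consult live load metrics per node; the node names and message labels here are purely illustrative:

```python
from collections import Counter
from itertools import cycle

class RoundRobinBalancer:
    """Illustrative round-robin distribution of incoming messages across
    cloud nodes; node names are hypothetical."""
    def __init__(self, nodes):
        self._nodes = cycle(nodes)

    def assign(self, task):
        # hand each task to the next node in the rotation
        return next(self._nodes)

balancer = RoundRobinBalancer(["node-1", "node-2", "node-3"])
assignments = Counter(balancer.assign(f"mqtt-msg-{i}") for i in range(9))
```

With nine messages and three nodes, each node receives exactly three tasks, so no individual node is overloaded.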
Abstract
This research aims at examining the expected gap between the current practice of production planning and control at the State Company for Electric Industries and the implementation of a material requirements planning system in a fuzzy environment. Developing solutions to bridge the gap requires specific mechanisms, subject to the logic of fuzzy rules, that keep pace with the demand for increased accuracy and reduced waiting times, depending on demand forecasts and investment in inventory to reduce costs to a minimum.
The proposed solutions for overcoming the research problem required some questions reflecting the problem in its multiple dimensions, which ar
A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data we might come across in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on similarities among them. In this paper, several mixture regression-based methods were conducted under the assumption that the data come from a finite number of components. A comparison of these methods has been made according to their results in estimating component parameters. Observation membership has also been inferred and assessed for these methods. The results showed that the flexible mixture model outperformed the others
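As a simple illustration of mixture-based parameter estimation and membership inference, the sketch below runs EM for a two-component one-dimensional Gaussian mixture on synthetic data. The mixture regression methods compared in the paper are more general; all settings here are illustrative:

```python
import numpy as np

def em_gmm_1d(x, n_iter=200):
    """EM for a two-component 1-D Gaussian mixture: estimates mixing weights,
    component means/variances, and a hard membership for each observation."""
    x = np.asarray(x, float)
    mu = np.array([x.min(), x.max()])          # crude initialisation
    var = np.array([x.var(), x.var()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each observation
        dens = pi * np.exp(-((x[:, None] - mu) ** 2) / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var, r.argmax(axis=1)

# two well-separated synthetic components
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 0.5, 100), rng.normal(10.0, 0.5, 100)])
pi, mu, var, z = em_gmm_1d(x)
```

The final responsibilities give each observation's membership, mirroring the membership assessment described in the abstract; on well-separated components the hard assignment recovers the true grouping.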