The Internet provides vital communication between millions of individuals. It is also increasingly used as a tool for commerce; security is therefore of high importance for securing communications and protecting vital information. Cryptographic algorithms are essential in the field of security. Brute-force attacks are the major attacks against the Data Encryption Standard (DES), which is the main reason an improved DES structure is needed. This paper proposes a new, improved structure for DES to make it secure and immune to attacks. The improved structure was accomplished using standard DES with a new two-key generation scheme: the key generation system produces two keys, one simple and the other encrypted using an improved Caesar algorithm. The encryption algorithm uses the simple key 1 in the first 8 rounds and the encrypted key 2 from round 9 to round 16. Using the improved DES structure, the results of this paper show increased encryption security, performance, and key-search complexity compared with standard DES, which means differential cryptanalysis cannot be performed on the cipher-text.
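As an illustration of the two-key idea described in this abstract, the sketch below derives a second key from the first with a simple Caesar-style byte shift and then selects which key feeds each round. The shift value and helper names are assumptions for illustration only, since the abstract does not specify the improved Caesar algorithm or the exact key schedule.

```python
# Illustrative sketch only: the paper's "improved Caesar algorithm" and exact
# key schedule are not given in the abstract, so the byte shift below is a
# hypothetical stand-in for the real key-2 derivation.

def caesar_shift_key(key: bytes, shift: int = 7) -> bytes:
    """Derive key 2 by shifting every byte of key 1 (a simple Caesar-style step)."""
    return bytes((b + shift) % 256 for b in key)

def select_round_key(round_no: int, key1: bytes, key2: bytes) -> bytes:
    """Rounds 1-8 use the plain key 1; rounds 9-16 use the encrypted key 2."""
    return key1 if round_no <= 8 else key2

key1 = bytes.fromhex("133457799BBCDFF1")   # example 64-bit DES key
key2 = caesar_shift_key(key1)

for r in range(1, 17):
    k = select_round_key(r, key1, key2)
    # ... k would feed the standard DES round-key schedule for round r ...
    print(f"round {r:2d} uses {'key 1' if r <= 8 else 'key 2'}: {k.hex()}")
```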
Intrusion detection systems (IDS) are useful tools that assist security administrators in the demanding task of securing the network and alerting them to any potentially harmful event. An IDS can be classified as either misuse-based or anomaly-based, depending on its detection methodology. Misuse IDS recognize known attacks from their signatures; the main disadvantage of these systems is that they cannot detect new attacks. Anomaly IDS, in contrast, rely on a model of normal behaviour; the main advantage of such a system is its ability to discover new attacks, while its main drawback is a high false-alarm rate. A hybrid IDS therefore combines misuse and anomaly detection and acts as a solution to overcome the disadvantages of both.
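A minimal sketch of the hybrid combination described above is given below: a signature (misuse) check backed by an anomaly score against a normal-behaviour profile. The signature set, feature score and threshold are hypothetical placeholders, not the detection rules of any specific system.

```python
# Minimal illustrative sketch of a hybrid IDS decision: misuse detection by
# signature matching, then anomaly detection by a threshold on a deviation
# score. All signatures, scores and thresholds here are made-up examples.

KNOWN_SIGNATURES = {"' OR 1=1 --", "/etc/passwd", "\\x90\\x90\\x90"}

def misuse_alert(payload: str) -> bool:
    """Flag traffic whose payload matches a known attack signature."""
    return any(sig in payload for sig in KNOWN_SIGNATURES)

def anomaly_alert(feature_score: float, threshold: float = 3.0) -> bool:
    """Flag traffic whose deviation from the normal profile exceeds a threshold."""
    return feature_score > threshold

def hybrid_ids(payload: str, feature_score: float) -> str:
    if misuse_alert(payload):
        return "ALERT: known attack (misuse detection)"
    if anomaly_alert(feature_score):
        return "ALERT: possible novel attack (anomaly detection)"
    return "normal"

print(hybrid_ids("GET /etc/passwd", feature_score=1.2))
```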
A three-stage learning algorithm for a deep multilayer perceptron (DMLP) with effective weight initialisation based on a sparse auto-encoder is proposed in this paper, which aims to overcome the difficulties of training deep neural networks with limited training data in a high-dimensional feature space. At the first stage, unsupervised learning with a sparse auto-encoder is used to obtain the initial weights of the feature extraction layers of the DMLP. At the second stage, error back-propagation is used to train the DMLP while the weights obtained at the first stage are kept fixed for its feature extraction layers. At the third stage, all the weights of the DMLP obtained at the second stage are refined by error back-propagation. Network structures an
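A minimal sketch of this three-stage scheme is shown below, written with PyTorch for illustration; the layer sizes, sparsity weight, optimiser settings and full-batch updates are assumptions chosen to keep the sketch short, not the paper's reported configuration.

```python
# Sketch of the three-stage training described above (illustrative settings).
import torch
import torch.nn as nn

class SparseAutoEncoder(nn.Module):
    def __init__(self, n_in=784, n_hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_in, n_hidden), nn.Sigmoid())
        self.decoder = nn.Linear(n_hidden, n_in)

    def forward(self, x):
        h = self.encoder(x)
        return self.decoder(h), h

def stage1_pretrain(ae, data, epochs=10, sparsity_weight=1e-3):
    """Stage 1: unsupervised sparse auto-encoder training (L1 penalty on codes)."""
    opt = torch.optim.Adam(ae.parameters(), lr=1e-3)
    for _ in range(epochs):
        recon, h = ae(data)
        loss = nn.functional.mse_loss(recon, data) + sparsity_weight * h.abs().mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

def build_dmlp(ae, n_classes=10):
    """Initialise the DMLP feature-extraction layers from the trained encoder."""
    return nn.Sequential(ae.encoder, nn.Linear(128, n_classes))

def stage2_train_head(dmlp, x, y, epochs=10):
    """Stage 2: back-propagation with the pre-trained feature layers frozen."""
    for p in dmlp[0].parameters():
        p.requires_grad = False
    opt = torch.optim.Adam(dmlp[1].parameters(), lr=1e-3)
    for _ in range(epochs):
        loss = nn.functional.cross_entropy(dmlp(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()

def stage3_finetune(dmlp, x, y, epochs=10):
    """Stage 3: refine all weights of the DMLP by back-propagation."""
    for p in dmlp.parameters():
        p.requires_grad = True
    opt = torch.optim.Adam(dmlp.parameters(), lr=1e-4)
    for _ in range(epochs):
        loss = nn.functional.cross_entropy(dmlp(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
```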
The bi-level programming problem is to minimize or maximize an objective function subject to constraints that contain another objective function. This problem has received a great deal of attention in the programming community due to the proliferation of applications and the use of evolutionary algorithms in addressing this kind of problem. Two non-linear bi-level programming methods are used in this paper. The goal is to reach the optimal solution through simulation, using the Monte Carlo method with different small and large sample sizes. The research concluded that the branch-and-bound algorithm is preferable for solving the non-linear bi-level programming problem, because it produced better results.
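To make the Monte Carlo approach concrete, the sketch below samples leader decisions for a toy non-linear bi-level problem and evaluates each against the follower's best response; the specific objectives, bounds and sample sizes are illustrative assumptions, not the paper's test problem.

```python
# Illustrative Monte Carlo sketch for a toy non-linear bi-level problem:
# the leader samples x, the follower then minimises its own objective in y,
# and the leader's objective is scored at that response.
import random

def follower_best_response(x):
    """Follower minimises (y - x)^2 + y over y in [0, 10] by a coarse grid search."""
    candidates = [i / 10 for i in range(0, 101)]
    return min(candidates, key=lambda y: (y - x) ** 2 + y)

def leader_objective(x, y):
    return (x - 3) ** 2 + (y - 2) ** 2          # leader wants x near 3, y near 2

def monte_carlo_bilevel(sample_size=1000, seed=0):
    random.seed(seed)
    best_x, best_val = None, float("inf")
    for _ in range(sample_size):
        x = random.uniform(0, 10)               # sample a leader decision
        y = follower_best_response(x)           # follower reacts optimally
        val = leader_objective(x, y)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

print(monte_carlo_bilevel(sample_size=100))     # small sample size
print(monte_carlo_bilevel(sample_size=10000))   # large sample size
```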
This paper presents a cognitive path-planning and control algorithm design for a nonholonomic wheeled mobile robot based on the Particle Swarm Optimization (PSO) algorithm. The aim of this work is to propose the circular roadmap (CRM) method to plan and generate an optimal path with free navigation, as well as to propose a nonlinear MIMO-PID-MENN controller to track the wheeled mobile robot along the reference path. PSO is used to tune online the control parameters of the proposed controller in order to obtain the best torque actions for the wheeled mobile robot. The numerical simulation results based on the Matlab package show that the proposed structure achieves a precise and highly accurate distance to the generated reference path.
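As a rough illustration of the PSO-based tuning step, the sketch below searches for PID-style gains that minimise a tracking cost on a toy first-order plant; the plant model, cost function and PSO settings are assumptions for illustration and are not the paper's MIMO-PID-MENN controller or robot model.

```python
# Minimal PSO sketch for tuning PID-style gains against a toy tracking cost.
import random

def tracking_cost(gains):
    """Simulate a first-order plant tracking a unit step under a PID law."""
    kp, ki, kd = gains
    y, integral, prev_err, cost, dt = 0.0, 0.0, 0.0, 0.0, 0.01
    for _ in range(500):
        err = 1.0 - y
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv
        y += dt * (-y + u)                      # simple first-order dynamics
        if abs(y) > 1e6:                        # unstable gains: heavily penalise
            return float("inf")
        cost += err ** 2 * dt
        prev_err = err
    return cost

def pso_tune(n_particles=20, iters=50, seed=1):
    random.seed(seed)
    dim, w, c1, c2 = 3, 0.7, 1.5, 1.5
    pos = [[random.uniform(0, 10) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [tracking_cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = tracking_cost(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

print(pso_tune())   # best (kp, ki, kd) found and its tracking cost
```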
Tench is a cyprinid fish that has undergone human-induced translocations. The natural populations of the species are in decline due to habitat loss and degradation of spawning grounds. The genetic diversity of seven natural populations was investigated to establish the genetic knowledge base for successful conservation efforts and for selective breeding. Twelve microsatellite markers, sequencing of a 615 bp section of mtDNA (Cytb), and PCR-RFLP analysis of two nuclear markers (Act and RpS7) were used to analyze the genetic variation and structure among 175 individuals. All microsatellite loci were found to have moderate levels of polymorphism. The pairwise Fst values between population pairs were moderate; the populations w
The research seeks to clarify the problems related to aspects of the financial and accounting process that result from entering into contractual arrangements with a period of more than 20 years. Among these is the research problem represented by the lack of clarity in the foundations and procedures for recognizing the oil costs and additional costs borne by foreign invested companies, which has weakened their credibility and reflected negatively on the measurement and accounting disclosure of the financial reports prepared by oil companies. The research aims to lay down sound procedures for measuring and classifying the oil costs and additional costs paid to foreign companies, and for recognizing and recording them in th
The article is devoted to Russian-Arabic translation, a particular theory of which has not been developed in domestic translation studies to the extent that the mechanisms of translation from and into European languages have been described. In this regard, and with the growing volume of Russian-Arabic translation, the issues of this particular theory of translation require significant additions and new approaches. The authors set the task of determining the means of translation (cognitive and mental operations and language transformations) that contribute to achieving the most equivalent correspondences between such typologically different languages as Russian and Arabic. The work summarizes and analyzes the accumulated experience
order to increase the level of security, as this system encrypts the secret image before sending it through the internet to the recipient (by the Blowfish method). The Blowfish method is known for its efficient security; nevertheless, its encryption time is long. In this research we apply a smoothing filter to the secret image, which decreases its size and consequently decreases the encryption and decryption times. After encryption, the secret image is hidden inside another image, called the cover image, using one of two methods: "Two-LSB" or "hiding most bits in blue pixels". Finally, we compare the results of the two methods to determine which one is better according to the PSNR measure.
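The sketch below illustrates two of the ideas mentioned in this abstract: hiding data in the two least-significant bits of a cover image and scoring the distortion with PSNR. The exact "Two-LSB" and blue-pixel embedding rules of the paper are not given in the abstract, so this is an assumed, simplified variant on a random cover image.

```python
# Illustrative 2-LSB embedding and PSNR scoring (simplified, grayscale cover).
import numpy as np

def embed_two_lsb(cover: np.ndarray, secret_bits: np.ndarray) -> np.ndarray:
    """Write 2 secret bits into the two LSBs of each cover pixel (uint8)."""
    pairs = secret_bits.reshape(-1, 2)
    values = pairs[:, 0] * 2 + pairs[:, 1]            # pack each bit pair as 0..3
    stego = cover.flatten().copy()
    stego[:len(values)] = (stego[:len(values)] & 0b11111100) | values
    return stego.reshape(cover.shape)

def psnr(original: np.ndarray, modified: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    mse = np.mean((original.astype(np.float64) - modified.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)   # stand-in cover image
secret_bits = np.random.randint(0, 2, size=512)                    # stand-in secret payload
stego = embed_two_lsb(cover, secret_bits)
print(f"PSNR of stego vs cover: {psnr(cover, stego):.2f} dB")
```

A higher PSNR means the stego image is closer to the original cover, which is the basis on which the abstract compares the two hiding methods.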