The Twofish cipher is a very powerful algorithm with a fairly complex structure that pervades most of the data parsing and switching, yet it can be implemented easily. The keys of the Twofish algorithm are of variable length (128, 192, or 256 bits), and the key schedule is generated once and reused for all message blocks, however many there are, which reduces the confidentiality of the encryption. This article discusses a process for generating cipher keys for each block, a concept that is new and unknown in the common block cipher algorithms. It is based on the continuous generation of subkeys for all blocks, with the key generation process adapted to each block. A Geffe generator is used to produce the subkeys, so that each plaintext block receives a new key that differs from block to block, gaining protection against attacks. As a result, this algorithm works almost like a one-time pad.
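Since the abstract gives no parameters, the following is a minimal Python sketch of a Geffe generator of the kind described: three LFSRs combined nonlinearly, with the keystream sliced into a fresh subkey per plaintext block. The register lengths, taps, seeds, and 128-bit subkey width are illustrative assumptions, not the paper's actual values.

```python
import itertools

def lfsr(seed, taps, nbits):
    """Fibonacci LFSR: emits one output bit per step."""
    state = seed
    while True:
        out = state & 1                        # low bit is the output
        fb = 0
        for t in taps:                         # feedback = XOR of the tap bits
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << (nbits - 1))
        yield out

def geffe(s1, s2, s3):
    """Geffe combiner: out = (x1 AND x2) XOR ((NOT x1) AND x3)."""
    for x1, x2, x3 in zip(s1, s2, s3):
        yield (x1 & x2) ^ ((x1 ^ 1) & x3)

ks = geffe(lfsr(0x1F, (0, 2), 5),              # toy register sizes; a real
           lfsr(0x2D, (0, 3), 6),              # design uses much longer,
           lfsr(0x5B, (0, 1, 4, 6), 7))        # pairwise-coprime registers

def next_subkey(bits=128):
    """Slice the next `bits` keystream bits into one per-block subkey."""
    word = 0
    for b in itertools.islice(ks, bits):
        word = (word << 1) | b
    return word

k0, k1 = next_subkey(), next_subkey()          # keys differ block to block
```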
This study seeks to deal academically with how the EU treats clandestine immigration through a purely security-based approach, grounded in the European understanding of the threats this phenomenon poses to communities and states in the EU at all levels. Member states therefore agreed to criminalize this threat within the bloc while using repressive tools and measures to limit the flow of illegal immigrants to European territories. Accordingly, the EU gave the phenomenon a security character, lifting it from the level of low politics, that of employment and the economy, to that of high politics, as a new security problem framed in a new security language embraced by ruling European elites; in other words, the EU treated this issue as a speech act emp…
Researching the effects of the research and technological development contract, determining its extent, and demarcating the boundaries of the obligations it imposes is the cornerstone of economic growth and development, because defining these obligations removes ambiguity and conflict between interests, by stating the rights owed to each party and even trying to reconcile them, or by imposing protection through guarantees that are compatible with the essence of the R&D contract. To study the subject thoroughly, we divide this research into two sections. The first is devoted to identifying the parties to the research and technological development contract. In the other, we explain the obligation…
In this study, a genetic algorithm was used to predict the reaction kinetics of the Iraqi heavy naphtha catalytic reforming process at the Al-Doura refinery in Baghdad. A one-dimensional steady-state model was derived to describe a commercial catalytic reforming unit consisting of four catalytic reforming reactors in series.
The experimental data (reformate composition and outlet temperature) collected for each of the four reactors at different operating conditions were used to estimate the parameters of the proposed kinetic model. The kinetic model involves 24 components, with 1 to 11 carbon atoms for paraffins and 6 to 11 carbon atoms for naphthenes and aromatics, and 71 reactions. The pre-exponential Arrhenius constants and a…
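To make the parameter-estimation step concrete, here is a minimal sketch of a genetic algorithm fitting Arrhenius parameters (pre-exponential constant A and activation energy Ea) to rate data. The single toy reaction, the simulated measurements, the bounds, and the GA settings are illustrative assumptions, not the paper's 24-component, 71-reaction model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: one first-order reaction with rate k = A * exp(-Ea / (R * T)).
R = 8.314
T = np.array([750.0, 770.0, 790.0, 810.0])        # reactor temperatures, K
k_meas = 2.0e5 * np.exp(-90_000.0 / (R * T))      # simulated "measurements"

def loss(params):
    A, Ea = params
    k_pred = A * np.exp(-Ea / (R * T))
    return np.sum((k_pred - k_meas) ** 2)          # sum-of-squares error

# Genetic algorithm over a population of (A, Ea) candidates.
lo = np.array([1e4, 50_000.0])
hi = np.array([1e6, 150_000.0])
pop = rng.uniform(lo, hi, size=(60, 2))

for gen in range(200):
    fit = np.array([loss(p) for p in pop])
    elite = pop[np.argsort(fit)[:20]]                            # selection
    parents = elite[rng.integers(0, 20, size=(60, 2))]
    alpha = rng.random((60, 1))
    pop = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]    # crossover
    pop += rng.normal(0.0, 0.02, pop.shape) * (hi - lo)          # mutation
    pop = np.clip(pop, lo, hi)

best = pop[np.argmin([loss(p) for p in pop])]
print("estimated A, Ea:", best)
```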
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key: for a higher level of secure communication, the key plays an essential role. To increase the level of security in any communication, both parties must have a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. An enhanced encryption key improves the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms…
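As a hedged illustration of the key-reconfiguration idea (the truncated abstract does not name the two algorithms it combines), the sketch below derives a fresh 24-byte 3DES key per session from a shared secret and a nonce; the SHA-256-based derivation is an assumption for illustration only, not the paper's method.

```python
import hashlib

def derive_3des_key(secret: bytes, nonce: bytes) -> bytes:
    """Derive a fresh 192-bit 3DES key (K1 || K2 || K3) per message/session.

    SHA-256 is used here purely as an illustrative key-derivation step;
    the resulting 24 bytes can be fed to any standard 3DES implementation.
    """
    digest = hashlib.sha256(secret + nonce).digest()   # 32 bytes
    return digest[:24]                                 # three 8-byte DES keys

# Each session gets a different key, so compromising one key exposes
# only the traffic encrypted under that one nonce.
k_a = derive_3des_key(b"shared secret", b"session-0001")
k_b = derive_3des_key(b"shared secret", b"session-0002")
assert k_a != k_b
```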
Text-based image clustering (TBIC) is an insufficient approach for clustering related web images, and abstracting the visual features of images with the support of textual information in a database is a challenging task. In content-based image clustering (CBIC), image data are clustered on the basis of specific features such as texture, color, boundaries, and shape. In this paper, an effective CBIC technique is presented that uses texture and statistical features of the images. The statistical features, or color moments (mean, skewness, standard deviation, kurtosis, and variance), are extracted from the images. These features are collected in a one-dimensional array, and then a genetic algorithm (GA) is applied for image clustering.
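A minimal sketch of the statistical-feature extraction just described, computing the five color moments per channel into a one-dimensional feature vector; image loading, the texture features, and the GA clustering stage are omitted, and the H x W x 3 channel layout is an assumption.

```python
import numpy as np

def color_moments(img: np.ndarray) -> np.ndarray:
    """img: H x W x 3 array. Returns a 1-D vector of five moments per channel."""
    feats = []
    for c in range(img.shape[2]):
        x = img[..., c].astype(np.float64).ravel()
        mean = x.mean()
        var = x.var()
        std = np.sqrt(var)
        skew = ((x - mean) ** 3).mean() / (std ** 3 + 1e-12)   # 3rd moment
        kurt = ((x - mean) ** 4).mean() / (var ** 2 + 1e-12)   # 4th moment
        feats += [mean, skew, std, kurt, var]
    return np.asarray(feats)            # length 15 for an RGB image

demo = np.random.default_rng(1).integers(0, 256, (64, 64, 3))
print(color_moments(demo))
```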
The Gumbel distribution has been treated with great care by researchers and statisticians. There are traditional methods for estimating the two parameters of the Gumbel distribution, known as maximum likelihood, the method of moments, and, more recently, the resampling method called the jackknife. However, these methods suffer from mathematical difficulties when solved analytically. Accordingly, there are other, non-traditional methods, such as the nearest-neighbors principle used in computer science, particularly in artificial-intelligence algorithms, including the genetic algorithm and the artificial neural network algorithm, which may be classified as meta-heuristic methods. Moreover, this nearest-neighbors principle has useful statistical features…
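As a concrete instance of one of the traditional estimators the abstract names, here is a minimal method-of-moments sketch for the Gumbel location mu and scale beta, using the standard closed forms E[X] = mu + gamma*beta and Var[X] = (pi*beta)^2 / 6 with the Euler-Mascheroni constant gamma; the data are simulated for illustration.

```python
import numpy as np

GAMMA = 0.5772156649  # Euler-Mascheroni constant

def gumbel_moments(x: np.ndarray):
    """Method-of-moments estimates of the Gumbel(mu, beta) parameters."""
    beta = x.std(ddof=1) * np.sqrt(6.0) / np.pi   # from Var = (pi*beta)^2 / 6
    mu = x.mean() - GAMMA * beta                  # from E[X] = mu + gamma*beta
    return mu, beta

rng = np.random.default_rng(2)
sample = rng.gumbel(loc=3.0, scale=2.0, size=5_000)
print(gumbel_moments(sample))   # should be close to (3.0, 2.0)
```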
The first successful implementation of Artificial Neural Networks (ANNs) was published a little over a decade ago, and it is time to review the progress made in this research area. This paper provides a taxonomy for classifying Field-Programmable Gate Array (FPGA) implementations of ANNs. Different implementation techniques and design issues are discussed, such as the trade-off between obtaining a suitable activation function and the numerical truncation technique, and improving the learning algorithm to reduce the cost per neuron and, consequently, the total cost and total speed of the complete ANN. Finally, the implementation of a complete, very fast circuit for the English-digit pattern NN, which has four layers of 70 nodes (neurons), o…
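To illustrate the activation-function/truncation trade-off the paper discusses, below is a sketch of the PLAN piecewise-linear sigmoid approximation, a classic hardware-friendly choice whose slopes and intercepts are exact powers of two (shifts and adds in hardware); whether this particular approximation is the one used in the paper is an assumption.

```python
import numpy as np

def plan_sigmoid(x: np.ndarray) -> np.ndarray:
    """PLAN approximation of the logistic sigmoid, defined piecewise on |x|."""
    y = np.empty_like(x, dtype=np.float64)
    a = np.abs(x)
    y[a >= 5.0] = 1.0                              # saturation region
    m = (a >= 2.375) & (a < 5.0)
    y[m] = 0.03125 * a[m] + 0.84375                # slope 1/32
    m = (a >= 1.0) & (a < 2.375)
    y[m] = 0.125 * a[m] + 0.625                    # slope 1/8
    m = a < 1.0
    y[m] = 0.25 * a[m] + 0.5                       # slope 1/4
    return np.where(x >= 0, y, 1.0 - y)            # sigmoid symmetry

xs = np.linspace(-6, 6, 7)
print(np.round(plan_sigmoid(xs), 3))               # approximation
print(np.round(1 / (1 + np.exp(-xs)), 3))          # exact sigmoid, to compare
```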