Symmetric cryptography forms the backbone of secure data communication and storage, relying on the strength and randomness of cryptographic keys. Strong, random keys increase complexity and make a cryptographic system more robust against a variety of attacks. The present work proposes a hybrid model based on the Latin square matrix (LSM) and subtractive random number generator (SRNG) algorithms for producing random keys. The hybrid model strengthens the cipher key against different attacks and increases the degree of diffusion; keys of different lengths can also be generated without compromising security. The model comprises two phases. The first phase generates a seed value by producing a random predefined set of n key numbers via Donald E. Knuth's SRNG algorithm (the subtractive method). The second phase uses the output key (or seed value) from the first phase as input to the Latin square matrix (LSM) to formulate a new random key. To increase the complexity of the generated key, it is XORed with another new random key of the same length that satisfies Shannon's principles of confusion and diffusion. Four test keys each of 128-, 192-, 256-, 512-, and 1024-bit length were used to evaluate the strength of the proposed model. The experimental results and security analyses revealed that all test keys passed the National Institute of Standards and Technology (NIST) statistical tests and achieved entropy values exceeding 0.98. The key length of the proposed model for n bits is 25*n, which is large enough to withstand brute-force attacks. Moreover, the generated keys are highly sensitive to the initial values, which further increases their resistance to different attacks.
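The two-phase scheme described above can be sketched as follows. This is a minimal, hypothetical illustration: Phase 1 produces seed numbers with Knuth's subtractive method, Phase 2 derives key bits from a Latin square walked in a seed-driven order, and a second same-length key is XORed in. The modulus, lag-table size, warm-up count, and the particular Latin square construction are illustrative assumptions, not the paper's exact parameters.

```python
# Hypothetical sketch of the two-phase key generation (not the authors' code).

M = 2**31 - 1  # illustrative modulus, assumed for this sketch

def subtractive_rng(seed, count):
    """Phase 1: Knuth's subtractive method, x[n] = (x[n-55] - x[n-24]) mod M."""
    s = [0] * 55
    s[54] = seed % M
    prev, val = s[54], 1
    for i in range(1, 55):
        pos = (21 * i) % 55 - 1        # scatter initial values over the lag table
        s[pos] = val
        prev, val = val, (prev - val) % M
    out, i, j = [], 0, 31              # circular positions of the lag-55/lag-24 taps
    for step in range(220 + count):    # warm-up rounds before producing real output
        s[i] = (s[i] - s[j]) % M
        if step >= 220:
            out.append(s[i])
        i, j = (i + 1) % 55, (j + 1) % 55
    return out

def latin_square_bits(nums, nbits):
    """Phase 2 (illustrative): walk the order-16 Latin square
    L[i][j] = (i + j) mod 16 in an order driven by the Phase-1 numbers,
    keeping the low bit of each visited cell."""
    n = 16
    order = sorted(range(n * n), key=lambda k: (nums[k % len(nums)], k))
    return [((k // n + k % n) % n) & 1 for k in order[:nbits]]

def generate_key(seed, nbits=128):
    key_a = latin_square_bits(subtractive_rng(seed, 64), nbits)
    # XOR with a second random key of the same length for extra confusion.
    key_b = [x & 1 for x in subtractive_rng(seed + 1, nbits)]
    return [a ^ b for a, b in zip(key_a, key_b)]
```

Because the whole pipeline is driven by the initial seed, any change to the seed propagates through both phases and flips the output key, mirroring the initial-value sensitivity the abstract reports.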
Phenomena often suffer from disturbances in their data and from difficulty of formulation, especially when the response is unclear or when a large number of essential differences plague the experimental units from which the data are taken. Hence the need arose to include an estimation method with an implicit rating of these experimental units, using discrimination or by creating blocks for each item of these experimental units, in the hope of controlling their responses and making them more homogeneous. With developments in the field of computing, and following the principle of the integration of sciences, it has been found that modern algorithms used in the field of computer science, such as the genetic algorithm or ant colo
The Jeribe Formation of the Jambour oil field is the major carbonate reservoir among the Tertiary reservoirs of the Jambour field in northern Iraq, which includes faults. Engineers have difficulty managing carbonate reservoirs since they are commonly tight and heterogeneous. This research presents a geological model of the Jeribe reservoir based on its facies and reservoir characterization data (permeability, porosity, water saturation, and net-to-gross). Four wells were studied. The geological model was constructed with the Petrel 2020.3 software. The structural maps were developed using a structural contour map of the top of the Jeribe Formation. A pillar-grid model with horizons and layering was designed for each zone. Followin
I found that it does not meet some of the requirements, including browsing and organizing structural elements, in which the researcher found scope for research. From here, the research problem can be formulated with the following question: Is there an actual need to develop the user-interface designs of the websites of Iraqi colleges of fine arts? The research comprises four chapters: the first chapter presents the research problem; the second chapter (the theoretical framework) includes three sections, the first identifying the user interface, the second covering the structural elements, and the third covering the rules of interface design and the dimensions of interaction; as well as the third chapter, i
Industrial investment according to clean production methods is an important element in the rational use of economic resources. The Iraqi industrial sector has relied on traditional production methods, and the productive activities in this sector did not take the environmental dimension into consideration, which prevented the optimal use of economic resources; it was therefore necessary to adopt new investment trends oriented toward clean production. The research is accordingly based on the hypothesis that "Clean Production contributes to improving the environment and the rational use of Natural Resources." Based on the descriptive-inductive analysis methodology applied to the study of Iraqi industries with Clean Production,
The research presents a comparative study, via simulation, of semi-parametric estimation methods for the partial linear single-index model. Two approaches to model estimation are considered: the two-stage procedure and MADE. Simulations were used to study the finite-sample performance of the estimation methods under different single-index models, error variances, and sample sizes, with the mean average squared error used as the comparison criterion. The results showed a preference for the two-stage procedure in all the cases considered.
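The comparison criterion mentioned above can be sketched in a few lines. This is a hedged illustration: for each simulated replicate, the average squared error of an estimator is computed, and the mean over replicates (the mean average squared error, MASE) ranks the methods. The toy linear design, the `simulate`, `good`, and `bad` names, and the replication count below are all assumptions for demonstration, not the paper's partial linear single-index setup.

```python
# Illustrative MASE criterion for comparing estimators by simulation.
import random

def mase(estimator, simulate, replications=200, seed=0):
    """Mean, over replicates, of the average squared error per replicate."""
    rng = random.Random(seed)          # fixed seed makes the study reproducible
    avg_errs = []
    for _ in range(replications):
        x, y = simulate(rng)
        y_hat = estimator(x)
        avg_errs.append(sum((p - t) ** 2 for p, t in zip(y_hat, y)) / len(y))
    return sum(avg_errs) / len(avg_errs)

# Toy design: y = 2*x + noise; two candidate "estimators" to rank by MASE.
def simulate(rng, n=50):
    x = [rng.uniform(-1, 1) for _ in range(n)]
    y = [2 * xi + rng.gauss(0, 0.1) for xi in x]
    return x, y

good = lambda x: [2 * xi for xi in x]   # close to the true regression
bad = lambda x: [0.0 for _ in x]        # ignores the signal entirely
```

With this criterion, `mase(good, simulate)` comes out far below `mase(bad, simulate)`, which is the same kind of ranking the study uses to prefer the two-stage procedure over MADE.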
Free radical formation in heme proteins is recognized as a factor in mediating the toxicity of many chemicals. The present study was designed to evaluate the dose-response relationship of the free radical scavenging properties of pentoxifylline in nitrite-induced Hb oxidation. Different concentrations of pentoxifylline were added at different time intervals of Hb oxidation in erythrocyte lysate, and the formation of methemoglobin (MetHb) was monitored spectrophotometrically. The results showed that in this model, pentoxifylline successfully attenuates Hb oxidation after challenge with sodium nitrite; this protective effect was found to be unrelated to the catalytic stage of Hb oxidation, th
The article critically analyzes traditional translation models. The most influential models of translation in the second half of the 20th century are discussed, among them the theory of formal and dynamic equivalence, the theory of regular correspondences, and the informative, situational-denotative, and functional-pragmatic theories of communication levels. The selected models are analyzed from the point of view of the universality of their application to different types of translation, as well as their ability to capture the deep links established between the original and the translation.