Technological development in the field of information and communication has been accompanied by security challenges related to the transmission of information, and encryption is a well-established solution. Encryption is one of the traditional methods of protecting plain text by converting it into an unintelligible form; it can be implemented using substitution techniques, shifting techniques, or mathematical operations. This paper proposes a method with two branches for encrypting text. The first branch is a new mathematical model for creating and exchanging keys; the proposed key-exchange method is a development of Diffie-Hellman, a new mathematical model for exchanging keys based on prime numbers with the possibility of using arbitrary integers. The second branch of the proposal is a multi-key encryption algorithm, which provides the ability to use more than two keys. The keys can be any integers (at least the last key must be prime) and need not be of the same length. The encryption process converts the text characters to suggested integer numbers, and these numbers are transformed repeatedly using a multilevel mathematical model (the number of levels depends on the number of keys used), while the decryption process is a one-level process using a single main key, with the other keys acting as secondary keys. Messages are encoded before encryption (using ASCII or any suggested coding system). The algorithm can use an unlimited number of keys of very large size (more than 7500 bytes), at least one of which is prime. Exponentiation of keys is also used to increase complexity. The experiments demonstrated the robustness and security of the key-exchange protocol and the encryption algorithm.
Comparing the suggested method with other methods shows that it is more secure, more flexible, and easier to implement.
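Since the paper's own key-exchange model is described only as a development of Diffie-Hellman, a minimal sketch of the classic Diffie-Hellman exchange it builds on may help; the toy modulus and generator below are illustrative assumptions, not the paper's parameters.

```python
import random

def dh_keypair(p, g):
    """Generate a private/public pair for classic Diffie-Hellman."""
    private = random.randrange(2, p - 1)
    public = pow(g, private, p)      # g^private mod p
    return private, public

# Small demonstration parameters (real deployments use >= 2048-bit primes).
p, g = 2087, 5                       # toy prime modulus and generator

a_priv, a_pub = dh_keypair(p, g)
b_priv, b_pub = dh_keypair(p, g)

# Each side combines its own private key with the other's public value.
shared_a = pow(b_pub, a_priv, p)
shared_b = pow(a_pub, b_priv, p)
assert shared_a == shared_b          # both sides derive the same secret
```

The security rests on the difficulty of recovering `a_priv` from `a_pub` (the discrete-logarithm problem), which is why the modulus must be a large prime in practice.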
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important element, of secret-key cryptography is the key: for a high level of secure communication, the key plays a critical role. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; an enhanced encryption key strengthens the security of the Triple Data Encryption Standard algorithm. This paper proposed a combination of two efficient encryption algorithms to
In this research study, I tried to trace the epic effect in order to learn how it was understood and how it was used. Following the descriptive and analytical approach, the first chapter presented the methodological framework: the problem, the goal, the limits of the research, its importance and the need for it, and the definition of terms, as well as the theoretical framework, which consisted of two topics: the first, the impact of the epic theater on world theater, and the second, the effect of the epic theater on Arab theater. This was done by tracing the epic impact on the world stage from the Greeks, through the Middle Ages and the Renaissance, to the Arab theater of the twentieth century.
As for the second
The research deals with the concept of stigma as one of the important phenomena that cast a shadow over the nature of the individual, his being, and his personality through the inferior view he faces in society. (Sartors) indicates in this regard that stigma may lead to negative discrimination that results in many harms: difficulty obtaining care, poor health services, and frequent setbacks that can damage self-esteem. The first roots of this phenomenon go back to Greek civilization, when the Greeks would burn or cut off some parts of the body and then announce to the nation that the bearer of this mark was a criminal, in addition to what the Arab peoples have lived through in setbacks that contributed to the exacerbation
In this paper, the goal of the proposed method is to protect data against different types of attacks by unauthorized parties. The basic idea of the proposed method is to generate a private key from specific features of a digital color image, namely its color channels (Red, Green, and Blue). The private key is generated from the colors of the digital color image by computing the color frequencies of the blue channel of the image, finding the maximum blue-channel frequency, multiplying it by its value, and performing an addition to produce the generated key. After the private key is generated, it must be converted into binary representation. The generated key is extracted from the blue channel of the keyed image, then we select a c
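The key-derivation steps described above (blue-channel histogram, maximum frequency, multiplication and addition) might be sketched as follows. The toy pixel list and the exact combination rule are illustrative assumptions, since the abstract's full procedure is truncated here.

```python
from collections import Counter

def blue_channel_key(pixels):
    """Derive an integer key from the blue channel of an RGB image.

    pixels: iterable of (r, g, b) tuples.
    Sketch: find the most frequent blue value, multiply the frequency
    by that value, and add the frequency back in (one plausible reading
    of "multiplying it by its number and adding").
    """
    blues = [b for (_, _, b) in pixels]
    freq = Counter(blues)
    value, count = freq.most_common(1)[0]   # dominant blue intensity
    key = count * value + count             # assumed combination rule
    return key, bin(key)[2:]                # integer key and binary form

# Toy "image": blue value 200 occurs 3 times (the maximum frequency).
img = [(10, 20, 200), (0, 0, 200), (5, 5, 200),
       (1, 2, 30), (9, 9, 30), (3, 4, 77), (8, 8, 90), (7, 7, 12)]
key, key_bits = blue_channel_key(img)       # key = 3*200 + 3 = 603
```

A real implementation would read the pixels with an imaging library and would likely post-process the key (hashing, length normalization) before use.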
In this study, a total of 209 leech individuals were collected from the Al-Hindyia River, Babil Province: 116 individuals were identified as Erpobdella octoculata (Linnaeus, 1758), 50 as Erpobdella punctata (Leidy, 1870), and 43 as Hemiclepsis marginata (Müller, 1774). Four samples were collected monthly during the period from February to June 2018. Some physical and chemical water properties were also examined, including air and water temperature, potential of hydrogen (pH), electrical conductivity (EC), total dissolved solids (TDS), dissolved oxygen (DO), and biological oxygen demand (BOD₅). Air and water temperature were r
This paper provides an identification key to the species of Orthetrum Newman, 1833 (Odonata, Libellulidae), covering six species that were collected from different localities in Iraq.
The species O. anceps (Schneider, 1845) is recorded as new for Iraq; the most important characters used in the diagnostic key are included.
Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
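As a simplified illustration of the downhill simplex (Nelder-Mead) idea, the sketch below minimizes the negative log-likelihood of a one-parameter exponential model, for which the maximum likelihood estimate is known in closed form (n divided by the sum of the data). The four-parameter compound distribution itself is not reproduced here; this is only the optimization pattern.

```python
import math

def downhill_simplex_1d(f, x0, step=0.5, tol=1e-10, max_iter=500):
    """Minimal 1-D Nelder-Mead (downhill simplex): reflect, expand, contract."""
    simplex = [x0, x0 + step]
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex
        if abs(f(worst) - f(best)) < tol:
            break
        refl = best + (best - worst)              # reflect worst through best
        if f(refl) < f(best):
            expd = best + 2 * (best - worst)      # try expanding further
            simplex = [best, expd if f(expd) < f(refl) else refl]
        elif f(refl) < f(worst):
            simplex = [best, refl]
        else:
            simplex = [best, (best + worst) / 2]  # contract toward best
    return min(simplex, key=f)

data = [0.5, 1.2, 0.3, 2.0, 0.8]                  # toy sample, assumed exponential

def neg_log_likelihood(lam):
    if lam <= 0:
        return math.inf                            # keep the rate positive
    return -len(data) * math.log(lam) + lam * sum(data)

mle_closed_form = len(data) / sum(data)            # n / sum(x) for the exponential
mle_simplex = downhill_simplex_1d(neg_log_likelihood, 1.0)
```

The derivative-free nature of the simplex search is what makes it attractive when the likelihood of a compound distribution has no closed-form maximizer.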
To improve processor efficiency in recent multiprocessor systems, cache memories are used to access data instead of main memory, which reduces access latency. In such systems, when different caches are installed in different processors in a shared-memory architecture, difficulties arise from the need to maintain consistency between the cache memories of the different processors, so a cache coherency protocol is very important in such kinds of systems. MSI, MESI, MOSI, and MOESI are well-known protocols for solving the cache coherency problem. In this research we have proposed integrating two states of the MESI cache coherence protocol, Exclusive and Modified, which responds to a request from reading
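As background, a toy model of the standard MESI transitions (not the paper's merged-state variant, whose description is truncated above) might look like the following sketch; the event names are illustrative.

```python
# Toy MESI state machine for one cache line (standard protocol, for context).
# States: M(odified), E(xclusive), S(hared), I(nvalid).

TRANSITIONS = {
    # (state, event) -> next state
    ("I", "local_read_miss_no_sharers"): "E",   # fetched, no other copy exists
    ("I", "local_read_miss_shared"):     "S",   # fetched, other caches hold it
    ("I", "local_write"):                "M",   # write allocates in Modified
    ("E", "local_write"):                "M",   # silent upgrade, no bus traffic
    ("E", "remote_read"):                "S",   # another cache reads the line
    ("S", "local_write"):                "M",   # invalidates the other sharers
    ("S", "remote_write"):               "I",
    ("M", "remote_read"):                "S",   # write back, then share
    ("M", "remote_write"):               "I",
}

def step(state, event):
    """Apply one CPU/bus event; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

line = "I"
for ev in ["local_read_miss_no_sharers", "local_write", "remote_read"]:
    line = step(line, ev)
# line is now "S": I -> E (exclusive fetch) -> M (silent write) -> S
```

The E-to-M transition needing no bus transaction is exactly the kind of interaction a merged Exclusive/Modified state would exploit.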
Ad hoc networks are a generation of truly wireless networks that can easily be constructed without any operator. There are protocols for managing these networks, in which the effective and important element is Quality of Service (QoS). In this work, the QoS performance of MANETs is evaluated by comparing the results of the AODV, DSR, OLSR, and TORA routing protocols using the OPNET Modeler, followed by an extensive set of performance experiments for these protocols with a wide variety of settings. The results show that the best protocol depends on QoS, using two types of applications (+ve and -ve QoS in the FIS evaluation). The QoS of a protocol varies from one prot
Classical cryptography systems exhibit major vulnerabilities because of the rapid development of quantum computing algorithms and devices. These vulnerabilities are mitigated using quantum key distribution (QKD), which is based on the quantum no-cloning principle that assures the safe generation and transmission of encryption keys. The quantum computing platform Qiskit has been utilized by many recent researchers to analyze the security of several QKD protocols, such as BB84 and B92. In this paper, we demonstrate the simulation and implementation of a modified multistage QKD protocol in Qiskit. The simulation and implementation studies were based on the "local_qasm" simulator and the "FakeVigo" backend, respectively. T
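For context, the basis-sifting step that BB84-style protocols (including multistage variants) rely on can be sketched classically, without Qiskit or quantum hardware. This simulation assumes an ideal channel with no eavesdropper and does not reproduce the paper's modified multistage protocol.

```python
import random

random.seed(7)  # reproducible demo

n = 32
# Alice picks random bits and random bases (0 = rectilinear, 1 = diagonal).
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.randint(0, 1) for _ in range(n)]
# Bob measures each qubit in a randomly chosen basis.
bob_bases   = [random.randint(0, 1) for _ in range(n)]

# Ideal channel, no eavesdropper: when the bases match, Bob reads Alice's
# bit exactly; when they differ, his measurement outcome is random.
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: keep only positions where the bases agree (about half of them).
sifted_key = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)
              if ab == bb]
bob_key    = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases)
              if ab == bb]
assert sifted_key == bob_key  # identical keys in the no-eavesdropper case
```

An eavesdropper measuring in random bases would disturb roughly a quarter of the sifted bits, which is what the parties test for before trusting the key.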