The Internet provides vital communication between millions of individuals. It is also increasingly used as a commerce tool; thus, security is of high importance for securing communications and protecting vital information. Cryptographic algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard (DES), which is the main reason an improved DES structure is needed. This paper proposes a new, improved DES structure to make it secure and immune to attacks. The improved structure is built on standard DES with a new two-key generation scheme: the key-generation system produces two keys, one simple and the other encrypted using an improved Caesar algorithm. The encryption algorithm uses the simple key 1 in the first 8 rounds and the encrypted key 2 from round 9 to round 16. With the improved DES structure, the results of this paper show increased encryption security, performance, and key-search complexity compared with standard DES, so differential cryptanalysis cannot be performed on the ciphertext.
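As a rough illustration of the two-key round-key selection described in this abstract, the sketch below derives one subkey list from each key and switches schedules at round 9. The helpers caesar_shift, derive_subkeys, and select_round_key are hypothetical stand-ins: the abstract does not detail the improved Caesar step or the DES key schedule, so simple placeholders are used.

```python
# Illustrative sketch of the two-key round selection described above.
# caesar_shift and derive_subkeys are hypothetical stand-ins: the abstract
# does not detail the improved Caesar step or the DES key schedule
# (PC-1/PC-2 are omitted entirely).

def caesar_shift(key_bytes: bytes, shift: int = 3) -> bytes:
    """Placeholder for the improved Caesar encryption applied to key 2."""
    return bytes((b + shift) % 256 for b in key_bytes)

def derive_subkeys(key_bytes: bytes, rounds: int = 16) -> list:
    """Placeholder key schedule producing one subkey per round."""
    return [bytes((b + r) % 256 for b in key_bytes) for r in range(rounds)]

def select_round_key(round_no: int, subkeys1: list, subkeys2: list) -> bytes:
    """Rounds 1-8 use key 1's schedule; rounds 9-16 use the Caesar-encrypted key 2."""
    return subkeys1[round_no - 1] if round_no <= 8 else subkeys2[round_no - 1]

key1 = b"8bytekey"
key2 = caesar_shift(b"2ndkey!!")          # key 2 passes through the Caesar step first
sk1, sk2 = derive_subkeys(key1), derive_subkeys(key2)
round_keys = [select_round_key(r, sk1, sk2) for r in range(1, 17)]
```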
Pathology reports are necessary for specialists to make an appropriate diagnosis of diseases in general and blood diseases in particular. Therefore, specialists check blood cells and other blood details. Thus, to diagnose a disease, specialists must analyze the factors of the patient's blood and medical history. Generally, doctors have tended to use intelligent agents to help them with complete blood count (CBC) analysis. However, these agents need analytical tools to extract the CBC parameters employed in predicting the development of life-threatening bacteremia and to offer prognostic data. Therefore, this paper proposes an enhancement to the Rabin–Karp algorithm and then combines it with the fuzzy ratio to make the algorithm suitable …
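A minimal sketch of one way to combine a Rabin–Karp rolling hash with a fuzzy ratio, assuming Python's difflib.SequenceMatcher as the similarity measure; the fuzzy_rabin_karp name, the threshold, and the scoring of every window are illustrative choices, not the paper's enhanced algorithm.

```python
from difflib import SequenceMatcher

def fuzzy_rabin_karp(text: str, pattern: str, threshold: float = 0.8,
                     base: int = 256, mod: int = 10**9 + 7):
    """Naive sketch: slide a Rabin-Karp rolling hash over the text and also
    score every window with a fuzzy ratio, reporting exact hash hits and
    windows whose ratio reaches the threshold."""
    m, n = len(pattern), len(text)
    if m == 0 or n < m:
        return []
    high = pow(base, m - 1, mod)                 # weight of the leading character
    p_hash = w_hash = 0
    for i in range(m):
        p_hash = (p_hash * base + ord(pattern[i])) % mod
        w_hash = (w_hash * base + ord(text[i])) % mod
    matches = []
    for i in range(n - m + 1):
        ratio = SequenceMatcher(None, text[i:i + m], pattern).ratio()
        if w_hash == p_hash or ratio >= threshold:
            matches.append((i, ratio))
        if i < n - m:                            # roll the hash to the next window
            w_hash = ((w_hash - ord(text[i]) * high) * base + ord(text[i + m])) % mod
    return matches

print(fuzzy_rabin_karp("white blood cells", "blod cells", threshold=0.8))
```

Scoring every window with the ratio negates the speed advantage of the hash; a practical variant would use the hash only to filter candidate windows.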
Future manufacturing systems foresee the use of intelligent vehicles that optimize and navigate. The navigation problem is an important and challenging problem in the field of robotics. Robots often find themselves in a situation where they must find a trajectory to another position in their environment, subject to constraints posed by obstacles and the capabilities of the robot itself. On-line navigation is a set of algorithms that plan and execute a trajectory at the same time. The system adopted in this research searches for a collision-free robot trajectory in a dynamic environment in which obstacles can move while the robot is moving toward the target. So, the robot …
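The abstract does not give the planning algorithm itself; purely as an illustration of "plan and execute at the same time", the sketch below shows one greedy re-planning cycle in which the robot picks the step closest to the target that keeps a clearance from the current (possibly moving) obstacle positions. All names and parameter values are assumptions.

```python
import math

def online_step(robot, goal, obstacles, step=0.5, clearance=1.0):
    """One plan-and-execute cycle: among a ring of candidate moves, pick the
    one closest to the goal that keeps a minimum clearance from the current
    obstacle positions; the caller re-runs this as the obstacles move."""
    best, best_cost = robot, math.dist(robot, goal)
    for angle in range(0, 360, 15):
        cand = (robot[0] + step * math.cos(math.radians(angle)),
                robot[1] + step * math.sin(math.radians(angle)))
        if all(math.dist(cand, ob) >= clearance for ob in obstacles):
            cost = math.dist(cand, goal)
            if cost < best_cost:
                best, best_cost = cand, cost
    return best

# Example cycle: the obstacle list would be refreshed from sensors each step.
position = online_step((0.0, 0.0), goal=(5.0, 5.0), obstacles=[(2.0, 2.0)])
```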
The histological structure of Pycnonotus leucotis was investigated to fill the dearth of information on the histology of the mid-brain in the available literature and to help in understanding its brain. The brain is wide and short, its length is 1.5 cm, and it consists of three regions. The middle region is the mesencephalon, which is divided into the optic tectum and the tegmentum. The optic tectum consists of six main layers, while the tegmentum contains the nuclei of cranial nerves.
This research discusses the subject of identity in the urban environment and attempts to answer a number of questions that come with the concept of identity. The first of these questions: What is identity? Can a definition or conceptual framework be developed for identity? What about individual, collective, cultural, ethnic, political, and regional identity? Is there a definition of identity in the urban environment in particular? If there is a definition of identity, what about the social mobility responsible for social change? How can we see identity through these kinetics? Can we assume that identity in the urban environment has a variable structure, or is it of variable shape with a more stable structure? Can we determine the spatial-temporal …
The study focuses on the first paleostress results from thrust-fault slip data of Tertiary age in the Hemrin North Structure, northern Iraq. The stress inversion was performed on the fault-slip data using an improved right-dihedron model, followed by rotational optimization (Georient software). The trends of the principal stress axes (σ1, σ2, and σ3) and the ratio of the principal stress differences (R) show that the main paleostress field is an NE-SW compressional regime. The Lisle graph and the Mohr diagram were also used to determine the paleostress magnitudes. The paleostress values of the study area were σ1 = 1430 bars, σ2 = 632 bars, and σ3 = 166 bars. The large magnitudes of the primary stress axes could be attributed to active tectonics …
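For reference, evaluating the stress ratio from the magnitudes reported above gives the following; the conventional definition of R is assumed here, since the abstract does not state which convention was used.

```latex
% Stress ratio from the reported magnitudes (in bars); the conventional
% definition is assumed, as the abstract does not state the convention used.
R = \frac{\sigma_2 - \sigma_3}{\sigma_1 - \sigma_3}
  = \frac{632 - 166}{1430 - 166}
  \approx 0.37
```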
This paper presents a proposed enhancement of the image-compression process using the RLE algorithm. The proposal decreases the size of the compressed image, whereas the original method is used primarily for compressing binary images [1] and mostly increases the size of the original image when used for color images. The enhanced algorithm was tested on a sample of ten 24-bit true-color BMP images, with an application built in Visual Basic 6.0 to show the size before and after the compression process and to compute the compression ratio for RLE and for the enhanced RLE algorithm.
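A minimal sketch of pixel-wise RLE and of the compression-ratio computation mentioned above; treating each 24-bit pixel as a single run symbol is only one plausible way to avoid the blow-up that byte-wise RLE causes on color images, and is not necessarily the paper's specific enhancement.

```python
def rle_encode_pixels(pixels):
    """Run-length encode a flat list of (R, G, B) tuples into (count, pixel) runs.
    Treating the whole 24-bit pixel as one symbol is one plausible way to avoid
    the size blow-up that byte-wise RLE causes on colour images."""
    if not pixels:
        return []
    runs, count, prev = [], 1, pixels[0]
    for px in pixels[1:]:
        if px == prev:
            count += 1
        else:
            runs.append((count, prev))
            count, prev = 1, px
    runs.append((count, prev))
    return runs

def compression_ratio(original_bytes, compressed_bytes):
    """Compression ratio expressed as original size over compressed size."""
    return original_bytes / compressed_bytes

pixels = [(255, 0, 0)] * 100 + [(0, 0, 255)] * 50          # a tiny synthetic image row
runs = rle_encode_pixels(pixels)
ratio = compression_ratio(len(pixels) * 3, len(runs) * 4)  # assuming 1 count byte + 3 colour bytes per run
```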
The integration of decision-making leads to more robust decisions and then to determining the optimum inventory level of the materials required for production and reducing the total cost, through cooperation of the purchasing department with the inventory department and with the company's other departments. Two models are suggested to determine the Optimum Inventory Level (OIL): the first model (OIL-model 1) assumes that the inventory level of material quantities equals the required materials, while the second model (OIL-model 2) assumes that the inventory level of material quantities exceeds the materials required for the next period.
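Purely as a toy illustration of how the two policies differ (the models' actual formulation and cost structure are not given in this abstract), the sketch below contrasts the stocked quantity and a simple total cost under each assumption; every function name and parameter is a placeholder.

```python
def oil_model_1(required_qty):
    """OIL-model 1: stock exactly the quantity required for production."""
    return required_qty

def oil_model_2(required_qty, extra_fraction=0.2):
    """OIL-model 2: stock more than the requirement to cover the next period.
    The extra_fraction is a placeholder; the abstract gives no formulation."""
    return required_qty * (1.0 + extra_fraction)

def total_cost(stocked_qty, unit_price, holding_cost_per_unit):
    """Toy total cost: purchase cost plus a simple holding charge."""
    return stocked_qty * (unit_price + holding_cost_per_unit)

cost_1 = total_cost(oil_model_1(1000), unit_price=4.0, holding_cost_per_unit=0.5)
cost_2 = total_cost(oil_model_2(1000), unit_price=4.0, holding_cost_per_unit=0.5)
```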