In this paper, a new modification was proposed to enhance the security level of the Blowfish algorithm by increasing the difficulty of cracking the original message, making it safer against unauthorized attacks. Blowfish is a symmetric, variable-length-key, 64-bit block cipher, and it is implemented here using grayscale images of different sizes. Instead of using a single key in the cipher operation, the proposed algorithm uses an additional one-byte key (KEY2), which is applied in the Feistel function of the first round in both the encryption and decryption processes. In addition, the proposed modified Blowfish algorithm uses five S-boxes instead of four; the additional key (KEY2) is selected randomly from the additional S-box (Sbox5), which is formed in GF(2^8) and is variable, in order to increase the complexity of the proposed algorithm. The obtained results were tested using several criteria: correlation, number of pixels change rate (NPCR), and mean square error (MSE). These tests showed that the correlation between image elements was significantly reduced by the encryption operation. The algorithm is also highly resistant to attempts to break the cryptographic key, since two keys are used in the encryption/decryption operations, which increases the complexity of the proposed algorithm.
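The following is a minimal, hedged sketch of the idea described above: a Blowfish-style Feistel structure whose F-function, in the first round only, additionally mixes in a one-byte key drawn from a fifth S-box. The S-box and P-array contents, and the exact way KEY2 is combined (an XOR here), are illustrative assumptions, not the authors' exact implementation.

import random

random.seed(0)
S = [[random.getrandbits(32) for _ in range(256)] for _ in range(4)]   # S-boxes S1..S4 (dummy contents)
S5 = [random.getrandbits(8) for _ in range(256)]                        # hypothetical fifth S-box over GF(2^8)
P = [random.getrandbits(32) for _ in range(18)]                         # P-array of round subkeys (dummy)
KEY2 = S5[random.randrange(256)]                                        # one-byte second key picked from S5

def F(x, rnd):
    # Standard Blowfish F-function; in the first round only, the output is
    # additionally XORed with KEY2 (this combination rule is an assumption).
    a, b, c, d = (x >> 24) & 0xFF, (x >> 16) & 0xFF, (x >> 8) & 0xFF, x & 0xFF
    y = (S[0][a] + S[1][b]) & 0xFFFFFFFF
    y ^= S[2][c]
    y = (y + S[3][d]) & 0xFFFFFFFF
    if rnd == 0:
        y ^= KEY2
    return y

def encrypt_block(left, right):
    # 16-round Feistel network; decryption would run the rounds in reverse,
    # applying KEY2 in its first round as well, as the abstract states.
    for rnd in range(16):
        left ^= P[rnd]
        right ^= F(left, rnd)
        left, right = right, left
    left, right = right, left
    right ^= P[16]
    left ^= P[17]
    return left, right

print([hex(v) for v in encrypt_block(0x01234567, 0x89ABCDEF)])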
Survival analysis is the analysis of data in the form of times from a time origin until the occurrence of an end event. In medical research, the time origin is the date on which the individual or patient is enrolled in a study, such as a clinical trial comparing two or more treatments, and the endpoint is the death of the patient or the withdrawal of the individual. The data resulting from this process are called survival times; if the end event is not death, the resulting data are called time-to-event data. That is, survival analysis comprises the statistical steps and procedures for analyzing data when the adopted variable is the time until an event occurs. This time could be d
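For reference, the standard definitions of the survival and hazard functions that underlie this kind of analysis can be written as follows (textbook definitions, not taken from the study itself):

% Standard definitions for a non-negative event time T (assumed background, not from the paper).
\[
  S(t) = \Pr(T > t) = 1 - F(t), \qquad
  h(t) = \lim_{\Delta t \to 0} \frac{\Pr(t \le T < t + \Delta t \mid T \ge t)}{\Delta t}
       = \frac{f(t)}{S(t)},
\]
where $F(t)$ is the cumulative distribution function of $T$ and $f(t)$ its density.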
Digital images have come to be used in various fields, such as physics, computer science, engineering, chemistry, biology, and medicine, in order to extract important information from them. However, any image acquired by optical or electronic means is likely to be degraded by the sensing environment. In this paper, we study and derive the iterative Tikhonov-Miller filter and the Wiener filter using a criterion function, and then use these filters to restore the degraded image. We show that the iterative Tikhonov-Miller filter performs better as the number of iterations increases up to a certain limit, after which its performance decreases. The iterative Tikhonov-Miller filter has better performance for less de
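As background, the criterion function and the two restoration filters discussed above are commonly written in the following standard forms, which may differ in detail from the derivation used in the paper. Here g is the degraded image, H the degradation (blur) operator, C a regularizing high-pass operator, lambda the regularization weight, and beta the iteration step size:

\[
  \Phi(\hat f) = \|g - H\hat f\|^{2} + \lambda \|C\hat f\|^{2},
\]
minimized by the Tikhonov-Miller solution
\(\hat f = (H^{T}H + \lambda C^{T}C)^{-1} H^{T} g\),
which can be computed iteratively as
\[
  \hat f_{k+1} = \hat f_{k} + \beta \left[ H^{T} g - (H^{T}H + \lambda C^{T}C)\,\hat f_{k} \right],
\]
while the Wiener filter restores each frequency component as
\[
  \hat F(u,v) = \frac{H^{*}(u,v)}{|H(u,v)|^{2} + S_{n}(u,v)/S_{f}(u,v)}\; G(u,v).
\]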
Honeywords are fake passwords that accompany the real password, which is called a "sugarword." The honeyword system is an effective password-cracking detection system designed to detect password cracking easily in order to improve the security of hashed passwords. For every user, the password file of the honeyword system holds one real hashed password accompanied by numerous fake hashed passwords. If an intruder steals the password file from the system, successfully cracks the passwords, and attempts to log in to users' accounts, the honeyword system will detect this attempt through the honeychecker. A honeychecker is an auxiliary server that distinguishes the real password from the fake passwords and t
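A minimal sketch of the general honeyword mechanism follows, assuming a toy hash function and an in-process stand-in for the honeychecker; the specific honeyword-generation method used in the paper is not reproduced here.

import hashlib, secrets

def h(pw: str) -> str:
    # Toy hash for illustration; real systems use salted, slow password hashes.
    return hashlib.sha256(pw.encode()).hexdigest()

def enroll(real_password: str, honeywords: list[str]):
    # The main server stores k hashed "sweetwords"; only the honeychecker
    # learns which index corresponds to the real password (the sugarword).
    sweetwords = honeywords + [real_password]
    secrets.SystemRandom().shuffle(sweetwords)
    index = sweetwords.index(real_password)
    password_file = [h(w) for w in sweetwords]
    return password_file, index

def login(candidate: str, password_file: list[str], honeychecker_index: int) -> str:
    digest = h(candidate)
    if digest not in password_file:
        return "reject"                      # not even a sweetword
    if password_file.index(digest) == honeychecker_index:
        return "accept"                      # real password
    return "ALARM: honeyword submitted - password file likely compromised"

pw_file, idx = enroll("correct-horse", ["blue-tiger7", "red-panda42", "green-wolf9"])
print(login("correct-horse", pw_file, idx))  # accept
print(login("red-panda42", pw_file, idx))    # alarm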
This paper introduces an algorithm for lossless compression of natural and medical images. It is based on utilizing various causal fixed predictors of one or two dimensions to remove the correlation, or spatial redundancy, embedded between image pixel values; a recursive polynomial model of a linear base is then used.
The experimental results of the proposed compression method are promising in terms of preserving the details and quality of the reconstructed images as well as improving the compression ratio compared with the results of a traditional linear predictive coding system.
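As an illustration of the causal fixed predictors mentioned above (not of the paper's recursive polynomial model), the following sketch computes one- and two-dimensional prediction residuals that a lossless coder could then model and encode:

import numpy as np

def residuals_1d(img: np.ndarray) -> np.ndarray:
    # 1-D causal predictor: each pixel is predicted by its left neighbour.
    pred = np.zeros_like(img, dtype=np.int32)
    pred[:, 1:] = img[:, :-1]
    return img.astype(np.int32) - pred

def residuals_2d(img: np.ndarray) -> np.ndarray:
    # 2-D causal predictor: mean of the left and upper neighbours.
    img = img.astype(np.int32)
    pred = np.zeros_like(img)
    pred[1:, 1:] = (img[1:, :-1] + img[:-1, 1:]) // 2
    pred[0, 1:] = img[0, :-1]
    pred[1:, 0] = img[:-1, 0]
    return img - pred

img = np.random.randint(0, 256, size=(8, 8), dtype=np.uint8)
res = residuals_2d(img)
# Lossless reconstruction is possible because the predictor is causal:
# each pixel is recovered from already-decoded neighbours plus its residual.
print(res.dtype, res.min(), res.max())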
Artificial fish swarm algorithm (AFSA) is one of the important swarm intelligence algorithms. In this paper, the authors enhance AFSA with diversity operators (AFSA-DO). The diversity operators produce more diverse solutions for AFSA, allowing it to obtain better results. AFSA-DO has been used to solve the flexible job shop scheduling problem (FJSSP), which is a significant problem in the domain of optimization and operations research. Several research papers have dealt with methods of solving this problem, including swarm intelligence approaches. In this paper, a set of FJSSP target samples is tested using the improved algorithm to confirm its effectiveness and evaluate its ex
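The excerpt does not specify which diversity operators AFSA-DO uses, so the following is only a rough sketch of one plausible operator of that kind: when fish (candidate solutions encoded as real-valued vectors) crowd too closely around a neighbour, some of them are re-initialized to keep the population diverse.

import numpy as np

rng = np.random.default_rng(42)

def apply_diversity_operator(population: np.ndarray, lower: float, upper: float,
                             crowd_radius: float = 0.05) -> np.ndarray:
    # Re-scatter any fish lying within crowd_radius (relative to the search span)
    # of another fish; this is an illustrative operator, not the paper's.
    pop = population.copy()
    span = upper - lower
    for i in range(len(pop)):
        for j in range(len(pop)):
            if i != j and np.linalg.norm(pop[i] - pop[j]) < crowd_radius * span:
                pop[i] = rng.uniform(lower, upper, size=pop.shape[1])
                break
    return pop

swarm = rng.uniform(0.0, 1.0, size=(20, 10))      # 20 fish, 10-dimensional encoding
swarm = apply_diversity_operator(swarm, 0.0, 1.0)
print(swarm.shape)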
This study was conducted in the Computer Science Department, College of Science, University of Baghdad, to compare automatic sorting with manual sorting in terms of efficiency and accuracy, and to apply artificial intelligence to automated sorting, including an artificial neural network, image processing, the study of external characteristics, defects and impurities, and physical characteristics such as grading and sorting speed and fruit weight. The results showed the values of impurities and defects; the highest regression value was 0.40, the error-approximation algorithm recorded the value 06-1, and the highest recorded fruit weight was 138.20 g. Gradin
The main goal of this work is study the land cover changes for "Baghdad city" over a period of (30) years using multi-temporal Landsat satellite images (TM, ETM+ and OLI) acquired in 1984, 2000, and 2015 respectively. In this work, The principal components analysis transform has been utilized as multi operators, (i.e. enhancement, compressor, and temporal change detector). Since most of the image band's information are presented in the first PCs image. Then, the PC1 image for all three years is partitioned into variable sized blocks using quad tree technique. Several different methods of classification have been used to classify Landsat satellite images; these are, proposed method singular value decomposition (SVD) using Visual Basic sof
The cost of pile foundations is part of the superstructure cost, and it has become necessary to reduce this cost by studying the available pile types and then selecting the optimal pile type in terms of cost, production time, and quality. The main objective of this study is therefore to solve the time-cost-quality trade-off (TCQT) problem by finding an optimal pile type with the target of minimizing cost and time while maximizing quality. There are many pile types in practice, but in this paper the researcher proposed five pile types, one of them non-traditional, developed a model for the problem, and then employed the particle swarm optimization (PSO) algorithm, as one of the evolutionary algorithms, with t
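One hedged way to express the time-cost-quality trade-off as a single fitness value that a PSO (or any evolutionary algorithm) could minimize is sketched below; the pile data and weights are invented for the example and are not taken from the study.

candidate_piles = {
    #  name            cost ($/m)  time (min/m)  quality (0-1)   -- all values hypothetical
    "driven_precast": (120.0,      8.0,          0.80),
    "bored_cast":     (150.0,     12.0,          0.90),
    "CFA":            (135.0,      6.0,          0.85),
    "steel_H":        (180.0,      5.0,          0.88),
    "proposed_type":  (110.0,      7.0,          0.87),
}

def tcqt_fitness(cost, time, quality, w_cost=0.4, w_time=0.3, w_quality=0.3,
                 max_cost=200.0, max_time=15.0):
    # Lower is better: normalized cost and time are penalties, quality is a reward.
    return w_cost * cost / max_cost + w_time * time / max_time - w_quality * quality

best = min(candidate_piles, key=lambda k: tcqt_fitness(*candidate_piles[k]))
print("best pile type under these assumed weights:", best)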
The gas-lift technique plays an important role in sustaining oil production, especially in a mature field where the reservoir's natural energy becomes insufficient. However, optimally allocating the gas injection rate across a large field through its gas-lift network system so as to maximize the oil production rate is a challenging task. Conventional gas-lift optimization approaches may become inefficient and incapable of modelling gas-lift optimization in a large network system with multiple objectives, multiple constraints, and a limited gas injection rate. The key objective of this study is to assess the feasibility of utilizing the Genetic Algorithm (GA) technique to optimize t
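The sketch below illustrates, under stated assumptions, how a genetic algorithm can allocate a limited total gas injection rate among wells to maximize total oil rate; the performance curves, well count, and GA settings are invented for the example and do not reproduce the study's actual network model.

import random

random.seed(1)
N_WELLS, TOTAL_GAS = 5, 10.0                 # assumed: 10 MMscf/d of gas available
COEFFS = [(4.0, 0.6), (3.5, 0.5), (5.0, 0.8), (3.0, 0.4), (4.5, 0.7)]  # per-well curve (a, b)

def oil_rate(gas, a, b):
    return a * gas - b * gas ** 2            # simple concave gas-lift performance curve

def fitness(alloc):
    alloc = [max(g, 0.0) for g in alloc]
    scale = TOTAL_GAS / (sum(alloc) or 1.0)  # repair step: enforce the total-gas constraint
    return sum(oil_rate(g * scale, a, b) for g, (a, b) in zip(alloc, COEFFS))

def ga(pop_size=40, generations=200, mut=0.2):
    pop = [[random.uniform(0, TOTAL_GAS) for _ in range(N_WELLS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]       # elitist selection of the better half
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, N_WELLS)
            child = p1[:cut] + p2[cut:]                                  # one-point crossover
            if random.random() < mut:
                child[random.randrange(N_WELLS)] += random.gauss(0, 0.5)  # Gaussian mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = ga()
alloc = [max(g, 0.0) for g in best]
scale = TOTAL_GAS / (sum(alloc) or 1.0)
print("best allocation (MMscf/d):", [round(g * scale, 2) for g in alloc])
print("total oil rate:", round(fitness(best), 2))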