Technological development in information and communication has been accompanied by security challenges in the transmission of information, and encryption is an effective solution. Encryption is one of the traditional methods of protecting plain text by converting it into an unintelligible form, and it can be implemented using substitution techniques, shifting techniques, or mathematical operations. This paper proposes a method with two branches to encrypt text. The first branch is a new mathematical model to create and exchange keys; the proposed key exchange method is a development of Diffie-Hellman, a new mathematical model for exchanging keys based on prime numbers with the possibility of using general integers. The second branch of the proposal is a multi-key encryption algorithm. The algorithm provides the ability to use more than two keys; the keys can be any kind of integer (at least the last key must be a prime number) and need not be of the same length. The encryption process converts the text characters to suggested integer numbers, which are then transformed repeatedly by a multilevel mathematical model (the number of levels depends on the number of keys used), while the decryption process is a one-level process using just one key as the main key, with the other keys used as secondary keys. Messages are encoded before encryption (by ASCII or any suggested system). The algorithm can use an unlimited number of keys of very large size (more than 7500 bytes), at least one of which is a prime number. Exponentiation of keys is also used to increase complexity. The experiments demonstrated the robustness and security of the key exchange protocol and the encryption algorithm.
Comparison of the suggested method with other methods shows that it is more secure, more flexible, and easier to implement.
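Since the proposed key exchange is described as a development of Diffie-Hellman, the classic protocol it builds on can be sketched as follows. This is background only, not the paper's multi-key variant; the prime modulus p and generator g are toy illustrative values, far smaller than a real deployment would use.

```python
# Classic Diffie-Hellman key exchange -- a minimal sketch of the protocol
# the proposed method develops. p and g are toy values (assumptions), not
# parameters from the paper.
import random

def dh_keypair(p, g):
    """Generate a (private, public) pair: public = g^private mod p."""
    private = random.randrange(2, p - 1)
    public = pow(g, private, p)
    return private, public

p, g = 2087, 5  # small prime modulus and generator, for illustration only

a_priv, a_pub = dh_keypair(p, g)
b_priv, b_pub = dh_keypair(p, g)

# Each side raises the other's public value to its own private exponent;
# both arrive at the same shared secret g^(ab) mod p.
shared_a = pow(b_pub, a_priv, p)
shared_b = pow(a_pub, b_priv, p)
```

Security rests on the difficulty of recovering the private exponents from the public values (the discrete logarithm problem), which is why real parameters must be very large primes.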
Drilling deviated wells is a frequently used approach in the oil and gas industry to increase the productivity of wells in thin reservoirs. Drilling these wells is challenging due to the low rate of penetration (ROP) and severe wellbore-instability issues. The objective of this research is to achieve better drilling performance by reducing drilling time and increasing wellbore stability.
In this work, the first step was to develop a model that predicts the ROP for deviated wells by applying Artificial Neural Networks (ANNs). In the modeling, azimuth (AZI) and inclination (INC) of the wellbore trajectory, controllable drilling parameters, unconfined compressive strength (UCS), formation
This study aimed to assess the efficiency of Nerium oleander in removing three metals (Cd, Cu, and Ni) from simulated wastewater using a horizontal subsurface flow constructed wetland (HSSF-CW) system. The HSSF-CW pilot scale was operated at two hydraulic retention times (HRTs) of 4 and 7 days, filled with a substrate layer of sand and gravel. The results indicated that the HSSF-CW had high removal efficiency for Cd and Cu. The higher HRT (7 days) resulted in greater removal efficiency, reaching up to 99.3% for Cd, 99.5% for Cu, and 86.3% for Ni, compared to 4 days. The substrate played a significant role in the removal of metals due to adsorption and precipitation. The N. oleander plant also showed good tolerance to the uptake of Cd, Cu, and Ni ions fr
Root research requires high-throughput phenotyping methods that provide meaningful information on root depth if the full potential of the genomic revolution is to be translated into strategies that maximise the capture of water deep in soils by crops. A very simple, low-cost method of assessing root depth of seedlings using a layer of herbicide (
A novel artificial neural network (ANN) model was constructed to calibrate a multivariate model for the simultaneous quantitative analysis of a quaternary mixture composed of carbamazepine, carvedilol, diazepam, and furosemide. Eighty-four mixing formulas were prepared and analyzed spectrophotometrically. Each analyte was formulated in six samples at different concentrations, so twenty-four samples for the four analytes were tested. A neural network of 10 hidden neurons was capable of fitting the data with 100% accuracy. The suggested model can be applied to the quantitative chemical analysis of the proposed quaternary mixture.
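The calibration idea above can be sketched with a minimal one-hidden-layer network (10 hidden neurons, as in the study) trained by plain gradient descent. The toy "spectra" below are simulated linear mixtures, and the mixture design, learning rate, and target blend are illustrative assumptions, not the paper's data or architecture details.

```python
# A pure-Python sketch of ANN multivariate calibration: 4 absorbance-like
# inputs -> 1 concentration, with 10 sigmoid hidden units. All data and
# hyperparameters here are illustrative assumptions.
import math, random

random.seed(1)

N_IN, N_HID = 4, 10
W1 = [[random.uniform(-0.5, 0.5) for _ in range(N_IN)] for _ in range(N_HID)]
W2 = [random.uniform(-0.5, 0.5) for _ in range(N_HID)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    y = sum(w * hi for w, hi in zip(W2, h))  # linear output unit
    return h, y

# Toy calibration set: 24 samples, mirroring the study's sample count;
# the target concentration is a fixed linear blend of the inputs.
data = []
for _ in range(24):
    x = [random.random() for _ in range(N_IN)]
    t = 0.4 * x[0] + 0.3 * x[1] + 0.2 * x[2] + 0.1 * x[3]
    data.append((x, t))

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

loss_before = mse()
lr = 0.3
for _ in range(500):                       # gradient-descent epochs
    for x, t in data:
        h, y = forward(x)
        err = y - t
        for j in range(N_HID):             # backpropagate the error
            grad_h = err * W2[j] * h[j] * (1 - h[j])
            W2[j] -= lr * err * h[j]
            for i in range(N_IN):
                W1[j][i] -= lr * grad_h * x[i]
loss_after = mse()
```

In practice the fit would be validated on held-out mixtures rather than the training set alone, since a 10-neuron network can easily overfit 24 samples.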
The present study investigated drought in Iraq using rainfall data obtained from 39 meteorological stations over the past 30 years (1980-2010). The drought coefficient was calculated on the basis of the standardized precipitation index (SPI), and the characteristics of drought magnitude, duration, and intensity were analyzed. The correlation and regression between drought magnitude and duration were obtained according to the SPI. The results show that drought magnitude values were greater in the northeast region of Iraq.
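The SPI-style standardization behind the drought coefficient can be sketched in simplified form. The full SPI fits a gamma distribution to the rainfall record and maps it to the standard normal; the z-score version below illustrates the idea only, and the rainfall values are invented for the example.

```python
# Simplified SPI-style drought coefficient: standardize a rainfall series
# to zero mean and unit variance, then flag drought periods. (The true SPI
# uses a gamma fit and a normal transform; the values below are
# illustrative assumptions, not the study's station data.)
from statistics import mean, stdev

rainfall = [210.0, 180.0, 95.0, 60.0, 250.0, 130.0, 40.0, 175.0]  # mm/season

mu, sigma = mean(rainfall), stdev(rainfall)
spi_like = [(r - mu) / sigma for r in rainfall]

# Common SPI interpretation: values <= -1.0 indicate drought, with more
# negative values indicating greater intensity.
drought_seasons = [i for i, z in enumerate(spi_like) if z <= -1.0]
```

Drought magnitude is then typically the sum of the index values over a drought run, and duration is the run length, which is how the correlation between the two is computed.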
Forest fires continue to rise during the dry season and are difficult to stop. High temperatures in the dry season can increase the drought index, which could potentially ignite the forest at any time; thus, the government should conduct surveillance throughout the dry season. Continuous surveillance without focusing on particular times becomes ineffective and inefficient, because preventive measures are carried out without knowledge of the potential fire risk. In the Keetch-Byram Drought Index (KBDI) approach, the Drought Factor formulation is used only to calculate today's drought based on current weather conditions and yesterday's drought index. However, to find out the drought factors a day ahead, the data
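For reference, one commonly cited formulation of the Drought Factor in terms of the KBDI is the fit published by Noble, Gill and Bary (1980) for McArthur's fire-danger meter. The constants below belong to that particular fit and are an assumption here; they are not necessarily the exact equations used in this study.

```python
# Drought Factor from KBDI, days since rain, and rain amount, per the
# Noble-Gill-Bary (1980) fit to McArthur's meter. Constants are from that
# published fit (an assumption relative to this study).
def drought_factor(kbdi, days_since_rain, rain_mm):
    """Drought Factor on a 0-10 scale; higher means drier fuels."""
    x = (days_since_rain + 1) ** 1.5
    df = 0.191 * (kbdi + 104.0) * x / (3.52 * x + rain_mm - 1.0)
    return min(df, 10.0)  # the factor is capped at 10

# A high KBDI after a long dry spell pushes the factor to the cap,
# while recent rain pulls it down sharply.
dry = drought_factor(kbdi=150.0, days_since_rain=20, rain_mm=2.0)
wet = drought_factor(kbdi=150.0, days_since_rain=0, rain_mm=15.0)
```

Forecasting the factor a day ahead, as the abstract discusses, then amounts to feeding predicted rather than observed weather into the KBDI update before evaluating this expression.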
In this paper, we derive an estimator of the reliability function for the two-parameter Laplace distribution using the Bayes method with a squared-error loss function, Jeffreys' formula, and the conditional probability of the random variable of observation. The main objective of this study is to find the efficiency of the derived Bayesian estimator compared to the maximum likelihood and moment estimators of this function, using simulation by the Monte Carlo method under different Laplace distribution parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood and moment estimators at all sample sizes.
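The Monte Carlo comparison framework can be sketched for the maximum-likelihood side: simulate Laplace samples, plug the ML estimates into the reliability function R(t) = P(T > t), and measure mean squared error against the true value. The paper's Bayes estimator has its own closed form and is not reproduced here; the true parameters, evaluation point, and replication counts below are illustrative assumptions.

```python
# Monte Carlo evaluation of the plug-in ML estimator of Laplace
# reliability R(t) = P(T > t). Parameter values are illustrative.
import math, random

random.seed(7)
MU, B, T = 0.0, 1.0, 1.5       # true location, scale, and evaluation point

def laplace_sample(mu, b):
    """Inverse-CDF sampling from Laplace(mu, b)."""
    u = random.random() - 0.5
    return mu - b * math.copysign(math.log(1 - 2 * abs(u)), u)

def reliability(t, mu, b):
    """R(t) = P(T > t) for the Laplace distribution."""
    if t >= mu:
        return 0.5 * math.exp(-(t - mu) / b)
    return 1.0 - 0.5 * math.exp((t - mu) / b)

def mle(xs):
    """ML estimates: location = sample median, scale = mean |deviation|."""
    s = sorted(xs)
    n = len(s)
    med = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
    return med, sum(abs(x - med) for x in xs) / n

true_R = reliability(T, MU, B)
errs = []
for _ in range(2000):                      # Monte Carlo replications
    xs = [laplace_sample(MU, B) for _ in range(30)]
    mu_hat, b_hat = mle(xs)
    errs.append((reliability(T, mu_hat, b_hat) - true_R) ** 2)
mse = sum(errs) / len(errs)
```

Repeating the same loop with the Bayes estimator in place of `mle` and comparing the two MSE values is exactly the efficiency comparison the abstract describes.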
This research deals with the use of a number of statistical methods, such as the kernel method, watershed, histogram, and cubic spline, to improve the contrast of digital images. The results, evaluated according to the RMSE and NCC measures, show that the spline method is the most accurate compared to the other statistical methods.
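The two quality measures used in the comparison are standard and can be sketched directly. The tiny flattened "images" below are invented pixel lists for illustration only.

```python
# RMSE and NCC, the two measures used to rank the enhancement methods.
# The pixel lists are illustrative toy data, not the study's images.
import math

def rmse(a, b):
    """Root mean squared error between two equal-length pixel lists."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def ncc(a, b):
    """Normalized cross-correlation: values near 1 mean the structure
    of the image is preserved."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a)
                    * sum((y - mb) ** 2 for y in b))
    return num / den

original = [10, 40, 80, 120, 200, 250]
enhanced = [12, 45, 78, 125, 198, 252]

print(rmse(original, enhanced))  # small -> little pixel-level distortion
print(ncc(original, enhanced))   # near 1 -> structure preserved
```

A low RMSE together with an NCC close to 1 is the combination the study uses to call a method "most accurate".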
Background: One of the most common problems encountered is postburn contracture, which has both functional and aesthetic impacts on patients. Various surgical methods have been proposed to treat this problem. Aim: To evaluate the effectiveness of the square flap in the management of postburn contracture in several parts of the body. Patients and methods: From April 2019 to June 2020, a total of 20 patients who had postburn contractures in various parts of their bodies underwent scar contracture release using a square flap. The follow-up period ranged from 6 to 12 months. Results: All of our patients achieved complete release of their contracture bands with maximum postoperative motion together with an acceptable aesthetic outcome. A