Cryptography is one of the primary techniques for transferring data safely from a sender to a receiver. To raise the level of data security further, DNA has been introduced into cryptography: DNA can readily store and carry data, which makes it an effective medium for this purpose and for implementing the computation. A new cryptographic system is proposed, consisting of two phases: an encryption phase and a decryption phase. The encryption phase includes six steps. It starts by converting the plaintext characters to their equivalent ASCII values and then to binary values. The binary values are converted to DNA characters and then to their equivalent complementary DNA sequences. The DNA sequences are converted to RNA sequences. Finally, the RNA sequences are converted to amino acids, and this amino-acid sequence is the ciphertext sent to the receiver. The decryption phase also includes six steps, which are the same steps in reverse order. It starts by converting the amino acids to RNA sequences, then the RNA sequences to DNA sequences and these to their equivalent complementary DNA. The DNA sequences are then converted to binary values and to their equivalent ASCII values, and the final step converts the ASCII values to the alphabetic characters of the plaintext. For evaluation purposes, six text files of different sizes were used as test material, and performance was measured in terms of encryption time and decryption time. The achieved results are good and fast: the encryption and decryption times for a 1 KB file are 2.578 ms and 2.625 ms respectively, while those for a 20 KB file are 268.422 ms and 245.469 ms respectively.
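As a rough sketch of the encoding chain, the code below walks a string through the ASCII, binary, DNA, complementary-DNA, and RNA steps. The 2-bit binary-to-DNA mapping and the Watson-Crick complement table are assumptions for illustration, and the final codon-to-amino-acid translation is omitted because the codon table is degenerate and the abstract does not say how the system keeps that step invertible.

```python
# Minimal sketch of the plaintext -> ASCII -> binary -> DNA -> complement -> RNA pipeline.
# The 2-bit encoding and complement rule below are illustrative assumptions; the proposed
# system may use different (or keyed) tables, and the amino-acid step is omitted here.

BIN2DNA = {"00": "A", "01": "C", "10": "G", "11": "T"}   # assumed 2-bit encoding
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}    # Watson-Crick complement

def encrypt(plaintext: str) -> str:
    bits = "".join(f"{ord(ch):08b}" for ch in plaintext)                  # ASCII -> binary
    dna = "".join(BIN2DNA[bits[i:i + 2]] for i in range(0, len(bits), 2)) # binary -> DNA
    cdna = "".join(COMPLEMENT[b] for b in dna)                            # complementary DNA
    return cdna.replace("T", "U")                                         # DNA -> RNA
    # A full implementation would now translate RNA codons into amino acids (the ciphertext).

def decrypt(rna: str) -> str:
    cdna = rna.replace("U", "T")                                          # RNA -> DNA
    dna = "".join(COMPLEMENT[b] for b in cdna)                            # undo the complement
    dna2bin = {v: k for k, v in BIN2DNA.items()}
    bits = "".join(dna2bin[b] for b in dna)
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))
```

With these assumed tables, `decrypt(encrypt("Hi"))` returns `"Hi"`, showing that every step up to the amino-acid translation is losslessly invertible.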
CO2 is considered one of the unfavorable gases and is a major contributor to air pollution. This pollution can be reduced by injecting the gas into oil reservoirs, where it provides good miscibility and increases the oil recovery factor. The minimum miscibility pressure (MMP) was estimated with the Peng-Robinson equation of state (PR-EOS). The South Rumila-63 (SULIAY) reservoir is the case studied, for which miscible displacement by CO2 is achievable according to the standard screening criteria for successful EOR processes. A PVT report was available for the reservoir under study, containing differential liberation (DL) and constant composition expansion (CCE) tests. PVTi, one of the packages of the Eclipse V.2010 software, was used to achieve this goal.
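For context, the sketch below evaluates the single-component Peng-Robinson pressure from the usual a, b, and alpha(T) parameters; the critical constants shown are textbook values for CO2, while the actual MMP estimate in PVTi works on the full multi-component reservoir-fluid composition from the PVT report.

```python
# Illustrative single-component Peng-Robinson EOS (the PVTi MMP calculation uses the
# multi-component form with the measured fluid composition; this is only a sketch).
R = 8.314  # universal gas constant, J/(mol*K)

def pr_pressure(T, V, Tc, Pc, omega):
    """Pressure in Pa from the PR-EOS for temperature T (K) and molar volume V (m^3/mol)."""
    a = 0.45724 * R ** 2 * Tc ** 2 / Pc                        # attraction parameter
    b = 0.07780 * R * Tc / Pc                                   # co-volume
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega ** 2
    alpha = (1.0 + kappa * (1.0 - (T / Tc) ** 0.5)) ** 2        # temperature correction
    return R * T / (V - b) - a * alpha / (V ** 2 + 2.0 * b * V - b ** 2)

# Example: CO2 (Tc = 304.13 K, Pc = 7.377 MPa, acentric factor ~0.224) at 350 K.
print(pr_pressure(T=350.0, V=2.0e-4, Tc=304.13, Pc=7.377e6, omega=0.224))
```

For these illustrative inputs the call prints a pressure of roughly 9.7 MPa.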
Electrospun nanofiber membranes are employed in a variety of applications because of their unique features. The characteristics of the nanofibers are affected by the polymer solution, and the solvent used to dissolve the polymer powder is critical when preparing the precursor solution. In this paper, polyacrylonitrile (PAN)-based nanofibers were prepared at a concentration of 10 wt.% using various solvents (NMP, DMF, and DMSO). The surface morphology, porosity, and mechanical strength of the three prepared 10 wt.% PAN-based nanofiber membranes (PAN/NMP, PAN/DMF, and PAN/DMSO) were characterized using scanning electron microscopy (SEM), the dry-wet weights method, and a dynamic mechanical analyzer (DMA). Using DMF as a solvent resulted in a lon
This article proposes a new strategy based on a hybrid method that combines the gravitational search algorithm (GSA) with the bat algorithm (BAT) to solve a single-objective optimization problem. It runs GSA first and then BAT as the second step. The proposed approach relies on a parameter between 0 and 1 to address the problem of becoming trapped in local search: without a local search mechanism the search intensifies while diversity stays high and easily falls into a local optimum. The improvement matches the speed of the original BAT while reaching the best solution faster, and all solutions in the population are updated before the proposed algorithm terminates. The diversification f
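A hypothetical two-stage outline is sketched below on a toy objective: a simplified GSA runs for the first part of the iteration budget, and its final population and velocities seed a simplified BAT stage. The split parameter p in [0, 1] is an illustrative stand-in for the 0-to-1 parameter mentioned above; the paper's actual update rules and the parameter's exact role may differ.

```python
import numpy as np

# Hypothetical GSA -> BAT hybrid sketch on a sphere function (illustrative only).
rng = np.random.default_rng(0)

def sphere(x):
    return float(np.sum(x ** 2))

def gsa_stage(X, V, fobj, iters, G0=10.0, alpha=20.0):
    """Simplified gravitational search: fitness-based masses, pairwise forces, velocity update."""
    n, d = X.shape
    for t in range(iters):
        fit = np.array([fobj(x) for x in X])
        best, worst = fit.min(), fit.max()
        m = (fit - worst) / (best - worst - 1e-12)      # better fitness -> larger mass
        M = m / (m.sum() + 1e-12)
        G = G0 * np.exp(-alpha * t / iters)             # decaying gravitational constant
        A = np.zeros_like(X)
        for i in range(n):
            for j in range(n):
                if i != j:
                    Rij = np.linalg.norm(X[i] - X[j]) + 1e-12
                    A[i] += rng.random(d) * G * M[j] * (X[j] - X[i]) / Rij
        V = rng.random((n, d)) * V + A
        X = X + V
    return X, V

def bat_stage(X, V, fobj, iters, fmin=0.0, fmax=2.0, A0=0.9, r0=0.5):
    """Simplified bat algorithm seeded with the GSA population."""
    n, d = X.shape
    fit = np.array([fobj(x) for x in X])
    loud, rate = np.full(n, A0), np.full(n, r0)
    k = int(np.argmin(fit))
    best, best_fit = X[k].copy(), fit[k]
    for t in range(iters):
        for i in range(n):
            freq = fmin + (fmax - fmin) * rng.random()  # frequency tuning
            V[i] = V[i] + (X[i] - best) * freq
            cand = X[i] + V[i]
            if rng.random() > rate[i]:                  # local walk around the best bat
                cand = best + 0.01 * loud.mean() * rng.standard_normal(d)
            fc = fobj(cand)
            if rng.random() < loud[i] and fc < fit[i]:  # accept, then cool loudness
                X[i], fit[i] = cand, fc
                loud[i] *= 0.95
                rate[i] = r0 * (1.0 - np.exp(-0.9 * (t + 1)))
            if fit[i] < best_fit:
                best, best_fit = X[i].copy(), fit[i]
    return best, best_fit

n, d, budget, p = 20, 5, 100, 0.5                       # p splits the budget between stages
X = rng.uniform(-5.0, 5.0, size=(n, d))
V = np.zeros((n, d))
X, V = gsa_stage(X, V, sphere, int(p * budget))
print(bat_stage(X, V, sphere, budget - int(p * budget)))
```

Running GSA first lets the gravitational pull of well-performing solutions concentrate the population before BAT's frequency-tuned walks refine the best one, so p effectively trades exploration against exploitation.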
Classification of imbalanced data is an important issue. Many algorithms have been developed for classification, such as back-propagation (BP) neural networks, decision trees, and Bayesian networks, and they have been used repeatedly in many fields. These algorithms suffer from the problem of imbalanced data, where some classes have many more instances than others. Imbalanced data result in poor performance and a bias toward one class at the expense of the others. In this paper, we propose three techniques based on over-sampling (O.S.) for processing an imbalanced dataset, redistributing it and converting it into a balanced dataset. These techniques are the Improved Synthetic Minority Over-Sampling Technique (Improved SMOTE), Border
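The specific modifications that make up the proposed Improved SMOTE are not given in the excerpt above; the sketch below only shows the basic SMOTE interpolation step that such over-sampling techniques build on, with illustrative parameter choices.

```python
import numpy as np

# Basic SMOTE-style interpolation sketch: each synthetic sample lies on the line segment
# between a minority sample and one of its k nearest minority neighbours.

def smote(X_min, n_new, k=5, seed=0):
    rng = np.random.default_rng(seed)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)      # distances to other minority points
        neighbours = np.argsort(d)[1:k + 1]               # skip the sample itself
        j = rng.choice(neighbours)
        gap = rng.random()                                 # interpolation factor in [0, 1]
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(synthetic)

# Example: 10 minority points in 2-D, 40 synthetic points generated to rebalance the set.
X_min = np.random.default_rng(1).normal(size=(10, 2))
print(smote(X_min, n_new=40).shape)                        # (40, 2)
```

The synthetic rows are appended to the minority class until the class counts match, after which any of the standard classifiers mentioned above (BP networks, decision trees, Bayesian networks) can be trained on the balanced set.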
Full-waveform airborne laser scanning data has shown its potential to enhance available segmentation and classification approaches through the additional information it can provide. However, this additional information cannot directly give a valid physical representation of surface features, because many variables affect the backscattered energy as it travels between the sensor and the target. In effect, this produces a mismatch between signals from overlapping flightlines. Direct use of this information is therefore not recommended without adopting a comprehensive radiometric calibration strategy that accounts for all these effects. This paper presents a practical and reliable radiometric calibration r
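The paper's own calibration routine is not included in the excerpt above. Purely as a generic illustration, the sketch below applies the common range-dependent normalization in which a backscatter cross-section is recovered from each echo's amplitude, width, and range via a calibration constant estimated over a reference target; the variable names and reference value are assumptions, not the paper's.

```python
import numpy as np

# Generic full-waveform radiometric calibration sketch (not the paper's specific routine).
# Echo amplitude and width are converted to a backscatter cross-section that no longer
# depends on range, so overlapping flightlines become comparable.

def backscatter_cross_section(amplitude, echo_width, range_m, c_cal):
    """sigma = C_cal * R^4 * amplitude * echo_width (Gaussian-decomposition form)."""
    return c_cal * range_m ** 4 * amplitude * echo_width

def calibration_constant(amplitude, echo_width, range_m, sigma_ref):
    """Estimate C_cal from echoes over a reference surface of known cross-section."""
    return np.mean(sigma_ref / (range_m ** 4 * amplitude * echo_width))

# Toy numbers: three echoes over an assumed reference target, then one new echo calibrated.
amp_ref = np.array([120.0, 115.0, 118.0])    # digitizer amplitudes (arbitrary units)
w_ref = np.array([4.1, 4.0, 4.2])            # echo widths (ns)
r_ref = np.array([510.0, 512.0, 509.0])      # ranges (m)
c_cal = calibration_constant(amp_ref, w_ref, r_ref, sigma_ref=1.0)   # assumed reference value
print(backscatter_cross_section(95.0, 4.3, 560.0, c_cal))
```

Once every echo carries a calibrated cross-section instead of a raw amplitude, measurements from overlapping flightlines can be compared and merged directly.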
One of the most important features of the Amazon Web Services (AWS) cloud is that a program can be run and accessed from any location: its results can be accessed and monitored remotely, many images can be stored, and computation is faster. This work proposes a face detection classification model based on the AWS cloud that classifies faces into two classes, a non-permission class and a permission class, by training on a real dataset collected from our cameras. The proposed cloud-based convolutional neural network (CNN) system shares computational resources for artificial neural networks (ANNs) to reduce redundant computation. The test system uses Internet of Things (IoT) services through our ca
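The excerpt does not describe the network architecture or the AWS deployment in detail, so the sketch below is only a generic two-class CNN of the kind such a system might train; the input size, layer widths, and directory layout are assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Generic permission / non-permission face classifier sketch (illustrative architecture).
model = tf.keras.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(64, 64, 3)),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),    # 1 = permission, 0 = non-permission
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Hypothetical training data: camera frames sorted into permission/ and non_permission/ folders.
train = tf.keras.utils.image_dataset_from_directory(
    "faces/train", image_size=(64, 64), batch_size=32, label_mode="binary")
model.fit(train, epochs=10)
```

In a cloud setup of this kind, training and inference would typically run on an AWS instance while the cameras only upload frames and read back the permission decision, which is what lets several cameras share one set of computational resources.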
The research aim was to observe the distribution pattern of