Secure information transmission over the internet is becoming an important requirement in data communication. These days, authenticity, secrecy, and confidentiality are the most important concerns in securing data communication. For that reason, information-hiding methods such as Cryptography, Steganography, and Watermarking are used to secure data transmission: cryptography encrypts the information into an unreadable form, steganography conceals the information within images, audio, or video, and watermarking is used to protect information from intruders. This paper proposes a new cryptography method that uses three different keys to make the system harder for outside attackers to break (the 1st and 3rd encryption keys are numerical, while the 2nd key is a string). The system operates in seven steps: the first step converts the plaintext based on the first generated key, substituting each character of the plaintext; the second step embeds the second generated key in the message to be sent; the third step converts the text to its equivalent ASCII codes; the fourth step converts these ASCII codes to binary numbers, which are then shifted based on the third generated key; the binary numbers are converted back to ASCII, and the last step converts the ASCII codes to their equivalent characters. The resulting text is the ciphertext that is sent.
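The abstract does not fix the exact substitution, embedding, or shifting rules, so the following is only a minimal Python sketch of the seven-step pipeline under simple assumptions: the first (numerical) key acts as a Caesar-style character shift, the second (string) key is appended to the message, and the third (numerical) key circularly rotates the concatenated bit string. The paper's actual rules may differ.

```python
# Minimal sketch of the seven-step encryption pipeline (assumed concrete rules).

def encrypt(plaintext: str, key1: int, key2: str, key3: int) -> str:
    # Step 1: substitute each character using the first (numerical) key.
    substituted = "".join(chr((ord(c) + key1) % 256) for c in plaintext)
    # Step 2: embed the second (string) key with the message.
    embedded = substituted + key2
    # Steps 3-4: convert characters to ASCII codes, then to an 8-bit binary string.
    bits = "".join(format(ord(c), "08b") for c in embedded)
    # Step 5: circularly shift the bit string by the third (numerical) key.
    shift = key3 % len(bits)
    shifted = bits[shift:] + bits[:shift]
    # Steps 6-7: regroup into bytes, convert back to ASCII codes and characters.
    return "".join(chr(int(shifted[i:i + 8], 2)) for i in range(0, len(shifted), 8))

print(repr(encrypt("HELLO", key1=3, key2="K2", key3=5)))
```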
In this article, the nonlinear problem of Jeffery-Hamel flow has been solved analytically and numerically by using reliable iterative and numerical methods. The approximate solutions are obtained by using the Daftardar-Jafari method (DJM), the Temimi-Ansari method (TAM), and the Banach contraction method (BCM). The obtained solutions are discussed numerically, in comparison with other numerical solutions obtained from the fourth-order Runge-Kutta method (RK4), the Euler method, and previous analytic methods available in the literature. In addition, the convergence of the proposed methods is established based on the Banach fixed point theorem. The results reveal that the presented methods are reliable, effective, and applicable to solving other nonlinear problems.
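As a rough illustration of the RK4 benchmark, the sketch below integrates the classical reduced Jeffery-Hamel equation f''' + 2*alpha*Re*f*f' + 4*alpha^2*f' = 0 recast as a first-order system. The channel half-angle, Reynolds number, and the initial guess for f''(0) are illustrative assumptions; in practice f''(0) is adjusted by shooting so that f(1) = 0, and the paper's exact formulation may include additional terms.

```python
# RK4 sketch for the reduced Jeffery-Hamel ODE (assumed parameters).
import numpy as np

alpha, Re = np.deg2rad(5.0), 50.0     # assumed half-angle and Reynolds number

def rhs(eta, y):
    f, fp, fpp = y
    return np.array([fp, fpp, -2.0 * alpha * Re * f * fp - 4.0 * alpha**2 * fp])

def rk4(y0, eta0, eta1, n):
    h, y, eta = (eta1 - eta0) / n, np.array(y0, dtype=float), eta0
    for _ in range(n):
        k1 = rhs(eta, y)
        k2 = rhs(eta + h / 2, y + h / 2 * k1)
        k3 = rhs(eta + h / 2, y + h / 2 * k2)
        k4 = rhs(eta + h, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        eta += h
    return y

print(rk4([1.0, 0.0, -2.2], 0.0, 1.0, 100))   # f, f', f'' at eta = 1
```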
The transportation problem is one of the most important mathematical methods that help in making the right decision for transferring goods from supply sources to demand centers at the lowest possible cost. In this research, a mathematical model of the three-dimensional transportation problem, in which the transported goods are not homogeneous, was constructed. The simplex programming method was used to solve the problem of transporting three food products (rice, oil, paste) from warehouses to the student areas in Baghdad. This model proved its efficiency in reducing the total transportation costs of the three products. After the model was solved in the (WinQSB) program, the results showed that the total cost of transportation is (269,
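The sketch below solves a single product's transportation sub-problem as a linear program, analogous to what the simplex method does in WinQSB; the warehouses, student areas, unit costs, supplies, and demands are purely hypothetical placeholders, not the paper's data.

```python
# Hypothetical 2-warehouse x 3-area transportation LP (one product).
from scipy.optimize import linprog

cost = [4, 6, 9,    # warehouse 1 -> areas A, B, C (cost per unit)
        5, 3, 7]    # warehouse 2 -> areas A, B, C
A_eq = [
    [1, 1, 1, 0, 0, 0],   # supply of warehouse 1
    [0, 0, 0, 1, 1, 1],   # supply of warehouse 2
    [1, 0, 0, 1, 0, 0],   # demand of area A
    [0, 1, 0, 0, 1, 0],   # demand of area B
    [0, 0, 1, 0, 0, 1],   # demand of area C
]
b_eq = [60, 40, 30, 45, 25]          # balanced: total supply = total demand = 100

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
print(res.x.reshape(2, 3), res.fun)  # shipment plan and minimum total cost
```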
The present paper is concerned with the problem of estimating the reliability of a system in the stress-strength model, under the assumption that the stress and strength are independent and non-identically distributed, each following the Lomax distribution. Various shrinkage estimation methods were employed in this context, based on maximum likelihood, the method of moments, and shrinkage weight factors, using Monte Carlo simulation. Comparisons among the suggested estimation methods have been made using the mean absolute percentage error criterion, implemented in MATLAB.
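For orientation, the sketch below estimates the stress-strength reliability R = P(Y < X) by plain Monte Carlo simulation when strength X and stress Y are independent Lomax variables; the shape and scale parameters are illustrative assumptions, and the paper's shrinkage, maximum-likelihood, and moment estimators are more elaborate than this direct simulation estimate.

```python
# Monte Carlo sketch of stress-strength reliability under Lomax models.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
alpha_x, alpha_y, lam = 3.0, 2.0, 1.0        # assumed shapes and common scale

x = lam * rng.pareto(alpha_x, n)             # numpy's pareto() samples the Lomax law
y = lam * rng.pareto(alpha_y, n)
print("simulated R:", np.mean(y < x))
# With a common scale, the closed form is R = alpha_y / (alpha_x + alpha_y).
print("closed-form R:", alpha_y / (alpha_x + alpha_y))
```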
Text clustering consists of grouping objects into similar categories. The initial centroids influence the operation of the system, with the potential for it to become trapped in local optima. The second issue pertains to the impact of a huge number of features on the determination of the optimal initial centroids. The dimensionality problem may be reduced by feature selection. Therefore, Wind Driven Optimization (WDO) was employed as a feature selection method to remove unimportant words from the text. In addition, the current study has integrated WDO as a novel clustering optimization technique to effectively determine the most suitable initial centroids. The result showed the new meta-heuristic which is WDO was employed as t
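The sketch below shows only the clustering stage: k-means run on a reduced feature subset with explicitly supplied initial centroids. In the study, both the retained features and the initial centroids would be produced by the WDO meta-heuristic; here they are hard-coded placeholders on toy documents.

```python
# k-means on a reduced TF-IDF feature set with supplied initial centroids.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = ["cats purr and sleep", "dogs bark loudly", "kittens and cats nap",
        "loud dogs and puppies bark"]
X = TfidfVectorizer().fit_transform(docs).toarray()

selected = np.arange(X.shape[1])[::2]        # placeholder for WDO feature selection
Xs = X[:, selected]

init = Xs[[0, 1]]                            # placeholder for WDO-chosen centroids
labels = KMeans(n_clusters=2, init=init, n_init=1).fit_predict(Xs)
print(labels)
```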
A novel median filter based on the crow optimization algorithm (OMF) is suggested to reduce random salt-and-pepper noise and improve the quality of RGB-colored and gray images. The fundamental idea of the approach is that the crow optimization algorithm first detects the noise pixels and then replaces them with an optimal median value according to a criterion of maximizing a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity, absolute square error, and mean square error have been used to test the performance of the suggested filters (the original and improved median filters) in removing noise from images. The simulation is carried out in MATLAB R2019b and the resul
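A simplified sketch of the filter's two phases on a grayscale image is given below: detect salt-and-pepper pixels (assumed here to be exactly 0 or 255) and replace only those pixels with the median of their 3x3 neighbourhood. The crow-search optimization of the replacement value described in the paper is omitted.

```python
# Detection-then-replacement median filtering for salt-and-pepper noise.
import numpy as np

def denoise(img: np.ndarray) -> np.ndarray:
    out = img.copy()
    noisy = (img == 0) | (img == 255)                 # detection phase
    padded = np.pad(img, 1, mode="edge")
    for i, j in zip(*np.nonzero(noisy)):              # replacement phase
        window = padded[i:i + 3, j:j + 3]             # 3x3 neighbourhood of (i, j)
        out[i, j] = int(np.median(window))
    return out

noisy_img = np.full((5, 5), 120, dtype=np.uint8)
noisy_img[2, 2], noisy_img[0, 4] = 255, 0             # inject salt and pepper
print(denoise(noisy_img))
```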
Authentication is the process of determining whether someone or something is, in fact, who or what it is declared to be. As the dependence upon computers and computer networks grows, the need for user authentication has increased. A user's claimed identity can be verified by one of several methods. One of the most popular of these methods is based on something the user knows, such as a password or Personal Identification Number (PIN). Biometrics is the science and technology of authentication by identifying a living individual's physiological or behavioral attributes. Keystroke authentication is a new behavioral access control approach that identifies legitimate users via their typing behavior. The objective of this paper is to provide user
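As a rough illustration of the kind of timing features keystroke authentication relies on, the sketch below computes dwell times (how long each key is held) and flight times (the gap between releasing one key and pressing the next) and compares them to a stored template; the key events, enrolled profile, and acceptance threshold are illustrative assumptions, not the paper's method.

```python
# Dwell/flight-time keystroke features compared against an enrolled template.
import numpy as np

# (key, press_time_ms, release_time_ms) for the typed word "pass"
events = [("p", 0, 90), ("a", 160, 240), ("s", 330, 410), ("s", 500, 585)]

dwell = np.array([r - p for _, p, r in events], dtype=float)
flight = np.array([events[i + 1][1] - events[i][2] for i in range(len(events) - 1)],
                  dtype=float)
sample = np.concatenate([dwell, flight])

template = np.array([85, 80, 82, 80, 72, 88, 92], dtype=float)   # enrolled profile
distance = np.linalg.norm(sample - template) / len(sample)
print("accept" if distance < 5.0 else "reject", round(distance, 2))
```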
Arabic text categorization for pattern recognition is challenging. We propose, for the first time, a novel holistic method based on clustering for classifying Arabic writers. The categorization is accomplished stage-wise. Firstly, the document images are segmented into lines, words, and characters. Secondly, structural and statistical features are extracted from the segmented portions. Thirdly, the F-measure is used to evaluate the performance of the extracted features and their combinations under different linkage methods, distance measures, and numbers of groups. Finally, experiments are conducted on the standard KHATT dataset of Arabic handwritten text, comprising varying samples from 1000 writers. The results in the generatio
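A minimal sketch of a common clustering F-measure is given below: each true class is matched to its best-scoring cluster and the per-class F values are weighted by class size. The labels are toy placeholders; the paper's exact evaluation protocol may differ.

```python
# Class-to-best-cluster F-measure for evaluating a clustering against known labels.
import numpy as np

def clustering_f_measure(truth, clusters):
    truth, clusters = np.asarray(truth), np.asarray(clusters)
    total = 0.0
    for c in np.unique(truth):
        in_class = truth == c
        best = 0.0
        for k in np.unique(clusters):
            in_cluster = clusters == k
            tp = np.sum(in_class & in_cluster)
            if tp == 0:
                continue
            precision = tp / in_cluster.sum()
            recall = tp / in_class.sum()
            best = max(best, 2 * precision * recall / (precision + recall))
        total += in_class.mean() * best   # weight by the class's share of samples
    return total

print(clustering_f_measure([0, 0, 1, 1, 2, 2], [1, 1, 0, 0, 0, 2]))
```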
Speech is the essential way for humans to interact with each other or with machines. However, it is always contaminated with different types of environmental noise. Therefore, speech enhancement algorithms (SEA) have appeared as a significant approach in the speech processing field to suppress background noise and recover the original speech signal. In this paper, a new efficient two-stage SEA with low distortion is proposed based on the minimum mean square error sense. The estimation of the clean signal is performed by taking advantage of Laplacian modeling of the speech and noise, based on the distribution of the coefficients of an orthogonal transform (the discrete Krawtchouk-Tchebichef transform). The Discrete Kra