The evolution of cryptography has been crucial to preserving sensitive information in the digital age. From the cipher algorithms of early societies to modern cryptographic methods, cryptography has developed alongside advances in computing. The growth of cyber threats and the spread of large-scale digital communication have underscored the importance of selecting effective and robust cryptographic techniques. This article reviews a range of cryptographic algorithms, covering both symmetric-key and asymmetric-key cryptography, and evaluates them according to security strength, complexity, and execution speed. The main findings demonstrate a growing reliance on elliptic curve cryptography owing to its efficiency and small key sizes, while highlighting the need for research in post-quantum cryptography to address the threats arising from quantum computing. The comparative analysis offers a comprehensive perspective that combines classical cryptographic algorithms with up-to-date approaches such as chaos-based systems and post-quantum cryptography, confirming that the study addresses the future of cryptographic security in the face of emerging challenges such as quantum computing.
The experiment aimed to compare different methods of measuring feed pellet durability under varying pellet die speeds and particle sizes (mill sieve hole diameters). Feed pellet durability was assessed in five different ways: direct pellet measurement (%), pellet length (%), pellet water absorption (%), durability by drop-box device (%), and durability by air-pressure device (%). Three pellet die speeds (280, 300, and 320 rpm) and three mill sieve hole diameters (2, 4, and 6 mm) were used. The results showed that increasing the pellet die speed from 280 to 300 and then to 320 rpm led to a significant decrease in feed pellet durability as determined by direct measurement, the drop-box device, and the air-pressure device, while pel
Currently, one of the topical application areas for machine learning methods is the prediction of material characteristics. The aim of this work is to develop machine learning models for determining the rheological properties of polymers from experimental stress relaxation curves. The paper presents an overview of the main directions in metaheuristic approaches (local search, evolutionary algorithms) to solving combinatorial optimization problems. Metaheuristic algorithms for several important combinatorial optimization problems are described, with special emphasis on the construction of decision trees. A comparative analysis of algorithms for solving the regression problem with the CatBoost Regressor has been carried out. The object of
In this article, we present a quasi-contraction mapping approach for the D iteration, and we prove that this iteration and the modified SP iteration have the same convergence rate. On the other hand, we prove that the D iteration approach for quasi-contraction maps is faster than certain leading current iteration methods, such as the Mann and Ishikawa iterations. We also give a numerical example.
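As a hedged illustration of the kind of numerical comparison this abstract describes, the sketch below runs the classical Mann and Ishikawa schemes (both standard) on the contraction T(x) = cos x, whose unique fixed point is ≈ 0.739085. The D iteration itself is paper-specific and is not reproduced here; the parameters α = β = 0.5 are illustrative choices, not the paper's.

```python
import math

def mann(T, x0, alpha=0.5, tol=1e-10, max_iter=1000):
    """Mann iteration: x_{n+1} = (1 - alpha) x_n + alpha T(x_n)."""
    x = x0
    for n in range(max_iter):
        x_new = (1 - alpha) * x + alpha * T(x)
        if abs(x_new - x) < tol:
            return x_new, n + 1
        x = x_new
    return x, max_iter

def ishikawa(T, x0, alpha=0.5, beta=0.5, tol=1e-10, max_iter=1000):
    """Ishikawa iteration: y_n = (1 - beta) x_n + beta T(x_n),
       x_{n+1} = (1 - alpha) x_n + alpha T(y_n)."""
    x = x0
    for n in range(max_iter):
        y = (1 - beta) * x + beta * T(x)
        x_new = (1 - alpha) * x + alpha * T(y)
        if abs(x_new - x) < tol:
            return x_new, n + 1
        x = x_new
    return x, max_iter

T = math.cos  # a contraction near its fixed point ~0.739085
p_mann, n_mann = mann(T, 1.0)
p_ishi, n_ishi = ishikawa(T, 1.0)
```

Both schemes reach the fixed point; counting iterations to a common tolerance, as done here, is the usual way such rate comparisons are reported numerically.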
In this paper, the effective computational method (ECM) based on standard monomial polynomials has been implemented to solve the nonlinear Jeffery-Hamel flow problem. Moreover, novel effective computational methods have been developed and proposed in this study using suitable basis functions, namely the Chebyshev, Bernstein, Legendre, and Hermite polynomials. The use of these basis functions converts the nonlinear problem into a nonlinear algebraic system of equations, which is then solved with the Mathematica® 12 program. The developed effective computational methods (D-ECM) have been applied to solve the nonlinear Jeffery-Hamel flow problem, and a comparison between the methods is shown. Furthermore, the maximum
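The Jeffery-Hamel equations themselves are lengthy, but the core idea — expanding the unknown in a polynomial basis and collocating the residual to obtain a nonlinear algebraic system — can be sketched on a toy problem. The sketch below is an illustration, not the paper's D-ECM: it solves y' = -y², y(0) = 1 (exact solution 1/(1+x)) with a Chebyshev basis and SciPy's `fsolve`; the degree N = 8 is an arbitrary illustrative choice.

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev
from scipy.optimize import fsolve

N = 8                                   # polynomial degree (illustrative)
# Chebyshev-Gauss collocation nodes mapped from [-1, 1] to (0, 1)
k = np.arange(1, N + 1)
nodes = 0.5 * (1.0 + np.cos((2 * k - 1) * np.pi / (2 * N)))

def residual(c):
    y = Chebyshev(c, domain=[0, 1])     # y(x) = sum_k c_k T_k(mapped x)
    dy = y.deriv()
    # N collocation residuals of y' + y^2 = 0, plus the condition y(0) = 1
    return np.concatenate([dy(nodes) + y(nodes) ** 2, [y(0.0) - 1.0]])

c0 = np.zeros(N + 1)
c0[0] = 1.0                             # initial guess: y ≈ 1
coeffs = fsolve(residual, c0)

x_test = np.linspace(0, 1, 50)
err = np.max(np.abs(Chebyshev(coeffs, domain=[0, 1])(x_test)
                    - 1.0 / (1.0 + x_test)))
```

Swapping the Chebyshev basis for Bernstein, Legendre, or Hermite polynomials changes only the basis object and nodes, which is the essence of the method comparison described above.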
In this paper, Volterra Runge-Kutta methods, including methods of order two and four, are applied to general nonlinear Volterra integral equations of the second kind. Moreover, we study the convergence of the Volterra Runge-Kutta algorithms. Finally, programs for each method are written in the MATLAB language, and a comparison between the two types is made based on the least-squares errors.
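As a minimal runnable analogue of direct time-stepping for such equations (a trapezoidal, order-two quadrature scheme rather than the paper's Volterra Runge-Kutta formulas), the sketch below solves u(t) = 1 + ∫₀ᵗ u(s) ds, whose exact solution is u(t) = eᵗ.

```python
import math

def volterra2_trapezoid(f, K, t_end, n):
    """Direct trapezoidal method for u(t) = f(t) + int_0^t K(t,s) u(s) ds."""
    h = t_end / n
    t = [i * h for i in range(n + 1)]
    u = [f(t[0])]                       # at t = 0 the integral vanishes
    for i in range(1, n + 1):
        # trapezoid rule over the history 0..t_i; the unknown u_i enters
        # through the last weight, so solve the linear equation for it
        s = 0.5 * K(t[i], t[0]) * u[0]
        s += sum(K(t[i], t[j]) * u[j] for j in range(1, i))
        u_i = (f(t[i]) + h * s) / (1.0 - 0.5 * h * K(t[i], t[i]))
        u.append(u_i)
    return t, u

# test problem: f(t) = 1, K(t,s) = 1, exact solution u(t) = exp(t)
t, u = volterra2_trapezoid(lambda t: 1.0, lambda t, s: 1.0, 1.0, 100)
err = max(abs(ui - math.exp(ti)) for ti, ui in zip(t, u))
```

Halving h should reduce `err` by roughly a factor of four, which is the second-order behavior the convergence study above establishes for the order-two scheme.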
The phytoremediation technique has become very efficient for treating soil contaminated with heavy metals. In this study, a pot experiment was conducted in which the Dodonaea plant (known as hops) was grown in soil previously contaminated with metals (Zn, Ni, Cd) added at concentrations of 0, 50, and 100 mg·kg⁻¹ for Ni and Zn, and 0, 5, and 10 mg·kg⁻¹ for cadmium. Irrigation was kept within the limits of the field capacity of the soil. Cadmium, nickel, and zinc were measured in the soil to determine the capacity of the plants to absorb the heavy contaminating metals, using the bioconcentration factor (BCF), bioaccumulation coefficient (BAC), and translocation factor (TF). Additionally, the BCF values of both Ni and Zn were l
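For reference, the three indices used here are simple concentration ratios; the sketch below computes them for hypothetical concentrations (illustrative values, not the study's data).

```python
def phyto_indices(c_soil, c_root, c_shoot):
    """BCF = root/soil, TF = shoot/root, BAC = shoot/soil.
    Values > 1 suggest accumulation (BCF, BAC) or efficient
    root-to-shoot transport (TF)."""
    bcf = c_root / c_soil
    tf = c_shoot / c_root
    bac = c_shoot / c_soil
    return bcf, tf, bac

# hypothetical Ni concentrations in mg/kg (not the study's measurements)
bcf, tf, bac = phyto_indices(c_soil=100.0, c_root=40.0, c_shoot=10.0)
```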
In this paper, the process of compounding two distributions is discussed using a new compounding procedure that connects a number of lifetime (continuous) distributions, where the number of these distributions is a random variable following one of the discrete distributions. Based on this procedure, a zero-truncated Poisson distribution has been compounded with a Weibull distribution to produce a new three-parameter lifetime distribution. An advantage of this construction is that the failure rate function admits many shapes (increasing, decreasing, unimodal, bathtub). The properties of the resulting distribution are studied, such as the expectation, variance, cumulative distribution function, reliability function, and fa
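Such a construction can be checked by simulation. Below is a hedged sketch — the paper's exact parameterization is not shown in this excerpt, so the lifetime is assumed to be the minimum of N i.i.d. Weibull variables with N zero-truncated Poisson. Under that assumption the survival function is S(t) = (exp(λ·S_W(t)) − 1)/(e^λ − 1), where S_W is the Weibull survival function.

```python
import math, random

def ztp_sample(lam, rng):
    """Zero-truncated Poisson via rejection: resample Poisson until n >= 1."""
    while True:
        # Knuth's Poisson sampler (fine for small lam)
        L, n, p = math.exp(-lam), 0, 1.0
        while p > L:
            n += 1
            p *= rng.random()
        n -= 1
        if n >= 1:
            return n

def compound_min_lifetime(lam, shape, scale, rng):
    """Assumed model: lifetime = min of N i.i.d. Weibull draws, N ~ ZTP."""
    n = ztp_sample(lam, rng)
    return min(rng.weibullvariate(scale, shape) for _ in range(n))

lam, shape, scale, t0 = 1.5, 2.0, 1.0, 0.8   # illustrative parameters
rng = random.Random(0)
samples = [compound_min_lifetime(lam, shape, scale, rng)
           for _ in range(100_000)]
emp = sum(s > t0 for s in samples) / len(samples)

sw = math.exp(-(t0 / scale) ** shape)        # Weibull survival at t0
theory = (math.exp(lam * sw) - 1) / (math.exp(lam) - 1)
```

The closed form follows from P(T > t) = E[S_W(t)^N] with the zero-truncated Poisson probability generating function E[z^N] = (e^{λz} − 1)/(e^λ − 1).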
In this work, results from an optical technique (the laser speckle technique) for measuring surface roughness were obtained using statistical properties of the speckle pattern from the point of view of computer image texture analysis. Four calibration relationships were used to cover a wide measurement range with the same laser speckle technique. The first is based on the intensity contrast of the speckle, the second on analysis of the speckle binary image, the third on the size of the speckle pattern spot, and the last on characterization of the energy feature of the gray-level co-occurrence matrices of the speckle pattern. With these calibration relationships, the surface roughness of an object's surface can be evaluated within the
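The first relationship rests on the standard speckle contrast C = σ_I/⟨I⟩; for fully developed speckle the intensity is negative-exponentially distributed and C → 1, with surface roughness inferred from departures of the measured contrast from that limit. A minimal numerical check on synthetic data (not the paper's measurements):

```python
import numpy as np

rng = np.random.default_rng(0)
# fully developed speckle: intensity follows a negative-exponential law
intensity = rng.exponential(scale=1.0, size=(512, 512))

contrast = intensity.std() / intensity.mean()   # C = sigma_I / <I>
```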
In this research, kernel (nonparametric density) estimators were relied upon in estimating the two-response logistic regression, comparing the Nadaraya-Watson method with the local scoring algorithm; the optimal smoothing parameter (bandwidth) λ was estimated by cross-validation and generalized cross-validation. The optimal bandwidth λ has a clear effect on the estimation process and plays a key role in smoothing the curve so that it approaches the real curve. The goal of using the kernel estimator is to modify the observations so that we can obtain estimators with characteristics close to the properties of the real parameters. Based on medical data for patients with chro
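The Nadaraya-Watson estimator and its cross-validated bandwidth can be sketched generically on synthetic data (the study's medical data and local-scoring details are not reproduced):

```python
import numpy as np

def nw_estimate(x0, x, y, h):
    """Nadaraya-Watson: kernel-weighted mean with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def loo_cv_score(x, y, h):
    """Leave-one-out cross-validation error for bandwidth h."""
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    np.fill_diagonal(w, 0.0)            # exclude each point from its own fit
    y_hat = (w @ y) / w.sum(axis=1)
    return np.mean((y - y_hat) ** 2)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 3, 300))
y = np.sin(x) + rng.normal(0, 0.1, x.size)  # noisy observations

grid = [0.05, 0.1, 0.2, 0.4, 0.8]           # candidate bandwidths
h_best = min(grid, key=lambda h: loo_cv_score(x, y, h))
fit = nw_estimate(x, x, y, h_best)
mse = np.mean((fit - np.sin(x)) ** 2)
```

Rerunning with a bandwidth far from `h_best` visibly over- or under-smooths the curve, which is the effect of λ on the estimate that the abstract emphasizes.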
Permeability is an essential parameter in reservoir characterization because it determines hydrocarbon flow patterns and volumes; for this reason, accurate and inexpensive methods for predicting permeability are important, and predictive models of permeability become more attractive as a result.
The Mishrif reservoir in southeastern Iraq was chosen, and the study is based on data from four wells that penetrate the Mishrif formation. This study discusses several methods for predicting permeability. The conventional method of developing a relationship between permeability and porosity is one of the strategies. The second technique uses flow units and a flow zone indicator (FZI) to predict the permeability of a rock mass u
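The FZI workflow rests on the standard definitions RQI = 0.0314·√(k/φ), φ_z = φ/(1−φ), and FZI = RQI/φ_z; within a flow unit, permeability is then back-calculated from porosity and the unit's representative FZI. A minimal sketch with illustrative values (not the Mishrif well data):

```python
import math

C = 0.0314  # unit-conversion constant (k in mD, RQI in microns)

def fzi(k_md, phi):
    """Flow zone indicator from core permeability (mD) and porosity (frac)."""
    rqi = C * math.sqrt(k_md / phi)     # reservoir quality index
    phi_z = phi / (1.0 - phi)           # normalized porosity
    return rqi / phi_z

def permeability_from_fzi(fzi_val, phi):
    """Invert the definitions: k = (FZI * phi_z / C)^2 * phi."""
    phi_z = phi / (1.0 - phi)
    return (fzi_val * phi_z / C) ** 2 * phi

# round trip on an illustrative core plug: k = 100 mD, phi = 0.20
f = fzi(100.0, 0.20)
k_back = permeability_from_fzi(f, 0.20)
```

In practice, core plugs are clustered by FZI into flow units, and `permeability_from_fzi` is applied to log-derived porosity in uncored intervals using each unit's mean FZI.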