Cryptography is the process of transforming messages to prevent unauthorized access to data. One of the main problems, and an important component, of secret-key algorithms is the key. The key plays an important role in achieving a higher level of secure communication. To increase the level of security in any communication, both parties must have a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. Encrypting the key enhances the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to satisfy the purpose of information security by adding a new level of security to the Triple Data Encryption Standard algorithm using the Nth Degree Truncated Polynomial Ring Unit (NTRU) algorithm. This aim is achieved by adding two new key functions, Enckey() for encrypting and Deckey() for decrypting the Triple Data Encryption Standard key, to make the algorithm stronger. The obtained results also show good resistance against brute-force attacks, which makes the system more effective, by applying the Nth Degree Truncated Polynomial Ring Unit algorithm to encrypt and decrypt the Triple Data Encryption Standard key. These modifications also enhance the degree of complexity, increase the key search space, and make the ciphered message difficult for an attacker to crack.
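As a rough illustration of the idea described above, the sketch below wraps a Triple DES key with placeholder Enckey()/Deckey() helpers standing in for the NTRU layer. It assumes the pycryptodome package for DES3; the key-encryption step here is a reversible XOR stub marking where a real NTRU encrypt/decrypt would go, not the paper's actual implementation.

```python
# Illustrative sketch only: a Triple DES key protected by a key-encryption step.
# The NTRU layer is represented by placeholder functions; a real implementation
# would substitute actual NTRU encryption/decryption here.
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

MASK = get_random_bytes(24)  # stands in for NTRU key material (assumption)

def Enckey(tdes_key: bytes) -> bytes:
    """Placeholder for NTRU encryption of the Triple DES key."""
    return bytes(a ^ b for a, b in zip(tdes_key, MASK))

def Deckey(protected_key: bytes) -> bytes:
    """Placeholder for NTRU decryption of the Triple DES key."""
    return bytes(a ^ b for a, b in zip(protected_key, MASK))

# Generate a 24-byte Triple DES key and protect it before any exchange.
tdes_key = DES3.adjust_key_parity(get_random_bytes(24))
protected = Enckey(tdes_key)

# The receiver recovers the key, then uses Triple DES for the message itself.
recovered = Deckey(protected)
cipher = DES3.new(recovered, DES3.MODE_ECB)
ciphertext = cipher.encrypt(pad(b"hello world", DES3.block_size))
plaintext = unpad(DES3.new(recovered, DES3.MODE_ECB).decrypt(ciphertext), DES3.block_size)
assert plaintext == b"hello world"
```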
In the present work, strong-lensing observations of several gravitational lenses have been adopted to study the geometry of the universe and to explain the physics and the size of the quasars. The first procedure was to study the geometry of the lensing system to determine the relation between the redshifts of the lensed objects and their distances. The second procedure was to compare the angular diameter distances "DA" calculated in the Euclidean case with those obtained from the Friedmann models, and then to evaluate the diameter of the lens system. The results indicate that the phenomenon is governed by the ratio of the lens-source distance to the diameter of the lens.
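For orientation, a minimal sketch of the second procedure is given below: the angular diameter distance of a lens at redshift z computed from a flat Friedmann (ΛCDM) model via astropy is compared with the simple Euclidean estimate cz/H0, and a physical diameter is then obtained from an assumed angular size. The redshift, angular size, and cosmological parameters are illustrative assumptions, not values taken from the observations used in the paper.

```python
# Sketch: Euclidean vs. Friedmann angular diameter distance for a lens at redshift z.
# All numbers below are illustrative assumptions, not the paper's data.
from astropy.cosmology import FlatLambdaCDM
import astropy.units as u

cosmo = FlatLambdaCDM(H0=70.0, Om0=0.3)    # assumed cosmology
z_lens = 0.5                               # assumed lens redshift
theta = (1.0 * u.arcsec).to(u.rad)         # assumed angular size of the lens

# Euclidean (Hubble-law) estimate of the distance.
c_km_s = 299792.458
D_euclid = (c_km_s * z_lens / cosmo.H0.value) * u.Mpc

# Angular diameter distance from the Friedmann model.
D_A = cosmo.angular_diameter_distance(z_lens)

# Physical diameter subtended by the assumed angular size.
diameter_kpc = (theta.value * D_A).to(u.kpc)

print(f"Euclidean distance : {D_euclid:.1f}")
print(f"Friedmann D_A      : {D_A:.1f}")
print(f"Implied diameter   : {diameter_kpc:.1f}")
```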
Among the variety of approaches introduced in the literature to establish duality theory, Fenchel duality has been of great importance in convex analysis and optimization. In this paper we establish conditions to obtain classical strong Fenchel duality for evenly convex optimization problems defined in infinite-dimensional spaces. The objective function of the primal problem is a family of (possibly infinitely many) evenly convex functions. The strong duality conditions we present are based on the epigraphs of the c-conjugates of the dual objective functions and the ε-c-subdifferential of the primal objective functions.
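As background for readers less familiar with the framework, the classical Fenchel-conjugate form of the duality being generalized can be sketched as follows; the paper itself works with c-conjugates and evenly convex functions, which are not reproduced here.

```latex
% Classical Fenchel duality (background sketch, not the paper's c-conjugate scheme).
% For proper convex functions f, g on a normed space X, with conjugates taken in X^*:
\[
  f^{*}(x^{*}) \;=\; \sup_{x \in X}\bigl\{ \langle x^{*}, x\rangle - f(x) \bigr\},
\]
\[
  \inf_{x \in X}\bigl\{ f(x) + g(x) \bigr\}
  \;\ge\;
  \sup_{x^{*} \in X^{*}}\bigl\{ -f^{*}(x^{*}) - g^{*}(-x^{*}) \bigr\}.
\]
% Strong duality means the inequality holds with equality and the supremum is attained;
% the paper gives conditions of this type expressed through epigraphs of c-conjugates
% and the epsilon-c-subdifferential.
```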
A hexapod robot is a flexible mechanical robot with six legs. It has the ability to walk over terrain. The hexapod robot looks like an insect, so it has the same gaits. These gaits are the tripod, wave, and ripple gaits. The hexapod robot needs to stay statically stable at all times during each gait in order not to fall, with three or more legs continuously in contact with the ground. This measure of safe, statically stable walking is called the stability margin. In this paper, the forward and inverse kinematics are derived for each of the hexapod's legs in order to simulate the hexapod robot model walking, using MATLAB R2010a, for all gaits, and the geometry is used in order to derive the equations of the sub-constraint workspaces for each
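As a rough illustration of the kinematics derived in the paper (which uses MATLAB R2010a), the sketch below implements forward and inverse kinematics for a single three-joint (coxa-femur-tibia) leg under one common joint-angle convention; the link lengths and sign conventions are assumptions for illustration, not the paper's parameters.

```python
# Sketch: forward/inverse kinematics of one 3-DOF hexapod leg (coxa, femur, tibia).
# Link lengths and the joint-angle convention are illustrative assumptions.
from math import atan2, acos, cos, sin, hypot, pi, isclose

L_COXA, L_FEMUR, L_TIBIA = 0.05, 0.10, 0.15  # metres (assumed)

def forward(coxa, femur, tibia):
    """Foot position in the leg frame for given joint angles (radians)."""
    r = L_COXA + L_FEMUR * cos(femur) + L_TIBIA * cos(femur + tibia)
    z = L_FEMUR * sin(femur) + L_TIBIA * sin(femur + tibia)
    return r * cos(coxa), r * sin(coxa), z

def inverse(x, y, z):
    """Joint angles (radians) that place the foot at (x, y, z) in the leg frame."""
    coxa = atan2(y, x)
    r = hypot(x, y) - L_COXA            # horizontal reach beyond the coxa joint
    d = hypot(r, z)                     # straight-line distance femur joint -> foot
    femur = atan2(z, r) + acos((L_FEMUR**2 + d**2 - L_TIBIA**2) / (2 * L_FEMUR * d))
    tibia = acos((L_FEMUR**2 + L_TIBIA**2 - d**2) / (2 * L_FEMUR * L_TIBIA)) - pi
    return coxa, femur, tibia

# Round-trip check: IK of an FK result recovers the original foot position.
target = forward(0.3, 0.2, -1.0)
angles = inverse(*target)
assert all(isclose(a, b, abs_tol=1e-9) for a, b in zip(forward(*angles), target))
```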
The nurse scheduling problem is a combinatorial optimization problem and one of the NP-hard problems that are difficult to solve to optimality. In this paper, we created a proposed hybrid simulated annealing algorithm to solve the nurse scheduling problem, developed from the simulated annealing algorithm and the genetic algorithm. We can note that the proposed algorithm (Hybrid Simulated Annealing Algorithm (GS-h)) is the best method among the other methods used in this paper because it achieved the minimum average total cost and the maximum number of Solved, Best, and Optimal problems. We can also note that the ratios of the optimal solution are 77% for the proposed algorithm (GS-h) and 28.75% for Si
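A compact, generic skeleton of a hybrid simulated annealing loop is sketched below for orientation: a neighbour is produced either by a random swap (the usual annealing move) or by a crossover with the best schedule found so far (the genetic-algorithm ingredient). The cost function, move operators, and parameters are hypothetical illustrations, not the GS-h algorithm evaluated in the paper.

```python
# Sketch of a hybrid simulated annealing / genetic-algorithm loop for a schedule
# encoded as a list of shift assignments. cost(), the move operators, and all
# parameters are illustrative assumptions, not the paper's GS-h.
import math, random

def cost(schedule):
    """Hypothetical penalty: number of adjacent identical shifts (placeholder)."""
    return sum(a == b for a, b in zip(schedule, schedule[1:]))

def swap_move(schedule):
    s = schedule[:]
    i, j = random.sample(range(len(s)), 2)
    s[i], s[j] = s[j], s[i]
    return s

def crossover(parent, best):
    cut = random.randrange(1, len(parent))
    return parent[:cut] + best[cut:]

def hybrid_sa(initial, t0=10.0, cooling=0.95, iters=2000):
    current, best = initial[:], initial[:]
    t = t0
    for _ in range(iters):
        # GA-flavoured move with small probability, plain annealing swap otherwise.
        cand = crossover(current, best) if random.random() < 0.2 else swap_move(current)
        delta = cost(cand) - cost(current)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current = cand
            if cost(current) < cost(best):
                best = current[:]
        t *= cooling
    return best

shifts = [random.choice("DEN") for _ in range(21)]  # D/E/N shifts, one nurse, 3 weeks
result = hybrid_sa(shifts)
print("".join(result), "cost =", cost(result))
```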
Permeability data are of major importance and must be handled in all reservoir simulation studies. The importance of permeability data increases in mature oil and gas fields due to their sensitivity to the requirements of specific improved recovery methods. However, the industry has a huge store of air permeability measurements against a small number of liquid permeability values. This is due to the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, which are conventionally measured during laboratory core analysis, into liquid permeability. This correlation introduces a feasible estimation in cases of data loss and poorly consolidated formations, or in cas
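The paper's correlation itself is not reproduced here; as a hedged illustration of how such a conversion could be calibrated, the sketch below fits a simple log-log relation between paired air and liquid permeability measurements. The sample data are synthetic placeholders, generated purely to make the snippet runnable, and the fitted form is only one plausible choice.

```python
# Sketch: calibrating a log-log correlation k_liquid = a * k_air**b from paired
# core measurements. The data are synthetic placeholders, not the paper's
# measurements, and this is not the paper's correlation.
import numpy as np

rng = np.random.default_rng(0)
k_air = np.logspace(0, 3, 25)                                   # synthetic air permeabilities, mD
k_liq = 0.6 * k_air**1.05 * rng.lognormal(0, 0.1, k_air.size)   # synthetic liquid values

# Least-squares fit in log space: log k_liq = log a + b * log k_air.
b, log_a = np.polyfit(np.log(k_air), np.log(k_liq), 1)
a = np.exp(log_a)

def air_to_liquid(k):
    """Convert an air permeability (mD) with the fitted correlation."""
    return a * k**b

print(f"k_liquid ~ {a:.3f} * k_air^{b:.3f}")
print("100 mD air ->", round(air_to_liquid(100.0), 1), "mD liquid (estimate)")
```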
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widespread and easy to use. Its use has led to an increase in the amount of data that business organizations deal with in an unprecedented manner. The amount of data available through the internet is a problem for which many parties seek solutions: why is it available there, in such a huge amount, so randomly? Many forecasts revealed that in 2017 the devices connected to the internet would number an estimated three times the population of the Earth, and in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, so-called data mining emerged as a
A non-stationary series is always a problem for statistical analysis; as some theoretical work has explained, the properties of statistical regression analysis are lost when non-stationary series are used, giving the slope of a spurious relation between the variables under consideration. A non-stationary series can be made stationary by adding a time variable to the multivariate analysis to remove the general trend, as well as seasonal dummy variables to remove the seasonal effect, by converting the data to exponential or logarithmic form, or by using repeated differencing d times, in which case the series is said to be integrated of order d. The research contains a theoretical side in parts: in the first part, the research methodology ha
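To make the idea concrete, the sketch below builds a synthetic series with a trend and a seasonal pattern, removes them in the ways mentioned above (first differencing, and regression on a time variable plus seasonal dummies), and checks stationarity with an augmented Dickey-Fuller test; the simulated series and all parameters are illustrative assumptions.

```python
# Sketch: turning a non-stationary series into a stationary one by differencing
# or by detrending with a time variable plus seasonal dummies (a log transform
# is a further option). The simulated series is an illustrative assumption.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(1)
n = 120
t = np.arange(n)
season = 10 * np.sin(2 * np.pi * t / 12)                    # monthly seasonal effect
y = pd.Series(50 + 0.8 * t + season + rng.normal(0, 3, n))  # trend + season + noise

# Option 1: repeated differencing; if d differences are needed, the series is I(d).
diff1 = y.diff().dropna()

# Option 2: regress on time and seasonal dummies, keep the residuals.
X = pd.get_dummies(pd.Series(t % 12, name="month"), drop_first=True).astype(float)
X.insert(0, "t", t)
X.insert(0, "const", 1.0)
beta, *_ = np.linalg.lstsq(X.values, y.values, rcond=None)
resid = y - X.values @ beta

for name, series in [("level", y), ("1st difference", diff1), ("detrended", resid)]:
    stat, pvalue, *_ = adfuller(series)
    print(f"{name:15s} ADF p-value = {pvalue:.3f}")
```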
Information security in data storage and transmission is increasingly important. On the other hand, images are used in many procedures; therefore, preventing unauthorized access to image data is crucial, by encrypting images to protect sensitive data or privacy. The methods and algorithms for masking or encoding images vary from simple spatial-domain methods to frequency-domain methods, which are the most complex and reliable. In this paper, a new cryptographic system is proposed, based on a random key generator hybridization methodology that takes advantage of the properties of the Discrete Cosine Transform (DCT) to generate an indefinite set of random keys, and of the low-frequency region coefficients after the DCT stage, to pass them to
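As a hedged illustration of the idea (not the paper's exact generator, whose details are cut off above), the sketch below takes an 8x8 image block, computes its 2-D DCT, keeps only the low-frequency coefficients in the top-left corner, and hashes them into key bytes; the block size, coefficient region, and hashing step are assumptions.

```python
# Sketch: deriving key material from the low-frequency DCT coefficients of an
# image block. Block size, coefficient region, and hashing are assumptions; the
# paper's own key generator is not reproduced here.
import hashlib
import numpy as np
from scipy.fft import dctn

rng = np.random.default_rng(7)
block = rng.integers(0, 256, size=(8, 8)).astype(float)   # stand-in 8x8 image block

coeffs = dctn(block, norm="ortho")      # 2-D DCT of the block
low_freq = coeffs[:4, :4]               # keep the low-frequency (top-left) region

# Quantise the low-frequency coefficients and hash them into a 256-bit key.
quantised = np.round(low_freq).astype(np.int64).tobytes()
key = hashlib.sha256(quantised).digest()

print("derived key:", key.hex())
```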