Cryptography is the process of transforming messages to prevent unauthorized access to data. One of the main problems, and an essential component, of secret-key cryptography is the key itself: a higher level of secure communication depends on it. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weakened by its key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. Encrypting the key enhances the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to satisfy the purpose of information security by adding a new level of security to the Triple Data Encryption Standard algorithm using the Nth Degree Truncated Polynomial Ring Unit algorithm. This aim is achieved by adding two new key functions, Enckey() and Deckey(), for encrypting and decrypting the Triple Data Encryption Standard key, making the algorithm stronger. The obtained results also show good resistance against brute-force attacks, which makes the system more effective, by applying the Nth Degree Truncated Polynomial Ring Unit algorithm to encrypt and decrypt the key of the Triple Data Encryption Standard. These modifications also enhance the degree of complexity, increase the key search space, and make the ciphered message difficult for an attacker to crack.
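The abstract does not list the paper's Enckey()/Deckey() implementations; the sketch below only illustrates the idea of wrapping the 3DES key with an NTRU-style layer. It assumes PyCryptodome for 3DES, and ntru_encrypt/ntru_decrypt are hypothetical placeholders for a real NTRU implementation, not an existing library API.

```python
# Sketch: protect the 3DES key with an NTRU-style wrap (hypothetical Enckey/Deckey).
# Requires: pip install pycryptodome. ntru_encrypt/ntru_decrypt are placeholders.
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes

def ntru_encrypt(public_key, data: bytes) -> bytes:    # placeholder, not a real API
    raise NotImplementedError("plug in an NTRU implementation here")

def ntru_decrypt(private_key, data: bytes) -> bytes:   # placeholder, not a real API
    raise NotImplementedError("plug in an NTRU implementation here")

def Enckey(des3_key: bytes, ntru_public_key) -> bytes:
    """Encrypt the 3DES session key with NTRU before sharing it."""
    return ntru_encrypt(ntru_public_key, des3_key)

def Deckey(wrapped_key: bytes, ntru_private_key) -> bytes:
    """Recover the 3DES session key from its NTRU-encrypted form."""
    return ntru_decrypt(ntru_private_key, wrapped_key)

# Usage sketch: generate a parity-adjusted 3DES key, then encrypt a message with it.
while True:
    try:
        des3_key = DES3.adjust_key_parity(get_random_bytes(24))
        break
    except ValueError:
        continue                                        # retry degenerate keys
# wrapped_key = Enckey(des3_key, ntru_public_key)       # sent alongside the ciphertext
cipher = DES3.new(des3_key, DES3.MODE_CBC)
ciphertext = cipher.iv + cipher.encrypt(b"a 16-byte block!")
```

In this arrangement the public-key NTRU layer addresses key distribution, while 3DES remains the bulk cipher for the message itself.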
A database is characterized as an organized and distributed collection of data that allows the user to access the stored information in a simple and convenient way. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average reduction of 50% in response time. The obtained results provide EEG r
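The abstract does not show the EEG pipeline itself; the following is a minimal MapReduce sketch using the mrjob library, assuming each input record is a hypothetical "channel_id,sample_value" line (not the paper's format) and computing the mean amplitude per channel.

```python
# Sketch: mean EEG amplitude per channel as a MapReduce job (pip install mrjob).
from mrjob.job import MRJob

class ChannelMean(MRJob):
    def mapper(self, _, line):
        # Assumed record layout: "channel_id,sample_value".
        if not line.strip():
            return                      # skip blank lines
        channel, value = line.strip().split(",")
        yield channel, float(value)

    def reducer(self, channel, values):
        samples = list(values)
        yield channel, sum(samples) / len(samples)

if __name__ == "__main__":
    ChannelMean.run()
```

The same script runs locally (`python channel_mean.py eeg.csv`) or on a Hadoop cluster (`python channel_mean.py -r hadoop hdfs:///eeg/`), which is the distribution property the MapReduce model is meant to provide.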
Association rule mining (ARM) is a fundamental and widely used data mining technique for extracting useful information from data. Traditional ARM algorithms degrade computational efficiency by mining too many association rules that are not appropriate for a given user. Recent research in ARM investigates the use of metaheuristic algorithms that search for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rule mining, DCS-ARM, is proposed for this purpose. The effectiveness of the algorithm is tested against a set of well-known transactional databases. Results indicate that the proposed algorithm outperforms existing metaheuristic methods.
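The paper's DCS-ARM operators are not given in the abstract; the skeleton below is only a generic discrete cuckoo search, assuming rules are encoded as bit vectors over the item set and scored by a placeholder fitness (a real ARM fitness would combine support and confidence measured on the database).

```python
# Sketch: generic discrete cuckoo search skeleton (not the paper's DCS-ARM).
import random

N_NESTS, N_ITEMS, PA, ITERATIONS = 20, 10, 0.25, 100

def fitness(rule):
    # Placeholder objective: prefer rules covering about 3 items.
    return -abs(sum(rule) - 3)

def random_rule():
    return [random.randint(0, 1) for _ in range(N_ITEMS)]

def discrete_levy_step(rule):
    """Discrete analogue of a Levy flight: flip a heavy-tailed number of bits."""
    new = rule[:]
    n_flips = min(N_ITEMS, max(1, int(random.paretovariate(1.5))))
    for i in random.sample(range(N_ITEMS), n_flips):
        new[i] ^= 1
    return new

nests = [random_rule() for _ in range(N_NESTS)]
for _ in range(ITERATIONS):
    # 1. Generate a cuckoo by a Levy-style move; replace a random nest if better.
    cuckoo = discrete_levy_step(random.choice(nests))
    j = random.randrange(N_NESTS)
    if fitness(cuckoo) > fitness(nests[j]):
        nests[j] = cuckoo
    # 2. Abandon a fraction PA of the worst nests and rebuild them randomly.
    nests.sort(key=fitness, reverse=True)
    for j in range(int((1 - PA) * N_NESTS), N_NESTS):
        nests[j] = random_rule()

best = max(nests, key=fitness)
print("best rule encoding:", best, "fitness:", fitness(best))
```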
With the development of cloud computing in recent years, data center networks have become a major topic in both industry and academia. Nevertheless, traditional methods based on manual configuration and hardware devices are burdensome, expensive, and cannot fully utilize the capability of the physical network infrastructure. Thus, Software-Defined Networking (SDN) has been promoted as one of the most promising solutions for future Internet performance. SDN is notable for two features: the separation of the control plane from the data plane, and the support of network development through programmable capabilities instead of hardware solutions. The current paper introduces an SDN-based optimized Resch
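The abstract does not include the controller logic; purely as an illustration of the programmability feature, the following sketch assumes the Ryu controller framework and OpenFlow 1.3 and installs a single table-miss rule that hands unmatched packets to software.

```python
# Sketch: a minimal Ryu/OpenFlow 1.3 app illustrating programmable control-plane logic.
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3

class TableMissApp(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
    def switch_features_handler(self, ev):
        dp = ev.msg.datapath
        ofp, parser = dp.ofproto, dp.ofproto_parser
        # Install a lowest-priority rule that forwards unmatched packets to the
        # controller, where software decides what to do next.
        match = parser.OFPMatch()
        actions = [parser.OFPActionOutput(ofp.OFPP_CONTROLLER, ofp.OFPCML_NO_BUFFER)]
        inst = [parser.OFPInstructionActions(ofp.OFPIT_APPLY_ACTIONS, actions)]
        dp.send_msg(parser.OFPFlowMod(datapath=dp, priority=0,
                                      match=match, instructions=inst))
```

Run with `ryu-manager table_miss_app.py` against an OpenFlow 1.3 switch (for example, a Mininet topology).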
A new definition of a graph called the pure graph of a ring, denoted Pur(R), is presented, where the vertices of the graph represent the elements of R and there is an edge between two vertices x and y if and only if x = xy or y = xy. In this work we study some new properties of Pur(R); finally, we define the complement of Pur(R) and study some of its properties.
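As a quick illustration of the definition, assuming R = Z_n with multiplication mod n and omitting loops, a few lines of Python enumerate the edges of Pur(Z_n):

```python
# Sketch: build the edge set of Pur(Z_n) directly from the stated definition.
from itertools import combinations

def pure_graph_edges(n: int):
    """Edges {x, y} of Pur(Z_n): x == x*y (mod n) or y == x*y (mod n)."""
    edges = []
    for x, y in combinations(range(n), 2):
        p = (x * y) % n
        if p == x or p == y:
            edges.append((x, y))
    return edges

print(pure_graph_edges(6))
```

For n = 6 this shows, for example, that 0 and 1 are adjacent to every other vertex, since 0·y = 0 and x·1 = x.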
In the present work, strong-lensing observations of several gravitational lenses are adopted to study the geometry of the universe and to explain the physics and size of quasars. The first procedure was to study the geometry of the lensing system in order to determine the relation between the redshifts of the gravitational observations and their distances. The second procedure was to compare the angular diameter distances D_A calculated for the Euclidean case with those obtained from the Friedmann models, and then to evaluate the diameter of the lens system. The results indicate that the phenomenon is constrained by the ratio of the distance between lens and source to the diameter of the lens.
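The abstract does not state the paper's adopted cosmology; purely as a sketch of the D_A comparison, the following assumes a flat Friedmann (ΛCDM) model with example parameters H0 = 70 km/s/Mpc and Ωm = 0.3.

```python
# Sketch: angular diameter distance, Euclidean approximation vs. a flat Friedmann model.
# H0 and Omega_m are example values, not the paper's adopted parameters.
import numpy as np
from scipy.integrate import quad

C_KM_S, H0, OMEGA_M = 299_792.458, 70.0, 0.3   # km/s, km/s/Mpc, dimensionless

def d_a_euclidean(z):
    """Euclidean (low-z) approximation: D ~ c*z / H0, in Mpc."""
    return C_KM_S * z / H0

def d_a_friedmann(z):
    """Flat Friedmann (LambdaCDM) angular diameter distance, in Mpc."""
    e = lambda zp: np.sqrt(OMEGA_M * (1 + zp) ** 3 + (1 - OMEGA_M))
    d_c, _ = quad(lambda zp: 1.0 / e(zp), 0.0, z)   # comoving distance in units of c/H0
    return (C_KM_S / H0) * d_c / (1 + z)

for z in (0.1, 0.5, 1.0):
    print(z, round(d_a_euclidean(z)), round(d_a_friedmann(z)))
```

Given an observed angular size θ (in radians), the physical diameter of the lens then follows as θ · D_A.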
Among the variety of approaches introduced in the literature to establish duality theory, Fenchel duality has been of great importance in convex analysis and optimization. In this paper we establish conditions under which classical strong Fenchel duality holds for evenly convex optimization problems defined in infinite-dimensional spaces. The objective function of the primal problem involves a (possibly infinite) family of evenly convex functions. The strong duality conditions we present are based on the epigraphs of the c-conjugates of the dual objective functions and the ε-c-subdifferential of the primal objective functions.
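For reference, the classical strong Fenchel duality being extended can be stated as follows; the paper itself works with c-conjugates and the ε-c-subdifferential of evenly convex functions, not the ordinary conjugate shown here.

```latex
% Classical Fenchel duality (reference statement, not the paper's evenly convex version).
\[
  \inf_{x \in X}\bigl\{ f(x) + g(x) \bigr\}
  \;\ge\;
  \sup_{x^{*} \in X^{*}}\bigl\{ -f^{*}(x^{*}) - g^{*}(-x^{*}) \bigr\},
  \qquad
  f^{*}(x^{*}) := \sup_{x \in X}\bigl\{ \langle x^{*}, x \rangle - f(x) \bigr\}.
\]
% Strong duality means equality holds and the supremum is attained, under a suitable
% regularity condition (classically, a point where f and g are finite and one of them
% is continuous).
```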