The Playfair cipher is a substitution scheme. The classical Playfair scheme uses a limited matrix containing only uppercase letters, so it is vulnerable to cryptanalysis. To increase the resistance of the Playfair cipher, this work proposes a new encipherment and decipherment method that depends on a permutation and its inverse, respectively. In addition, a modified key matrix is utilized, which includes uppercase and lowercase letters, numbers, and 38 special characters drawn from the ASCII codes. The proposed method combines substitution and transposition schemes: the first stratum of the cipher is a substitution using the key matrix, and the second stratum is a transposition using a permutation key, which provides multi-stratum resistance to brute-force and other cryptanalysis attacks. A comparison between the traditional Playfair scheme and the proposed method demonstrates that the encoded text is hard for cryptanalysts to recognize, which improves the security of the encryption process.
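As background for the two strata described above, the sketch below implements the classical 5×5 Playfair substitution stratum followed by a simple block-wise permutation as the transposition stratum. The key "MONARCHY" and the permutation [2, 0, 3, 1] are illustrative assumptions, not the paper's keys; the paper's actual matrix additionally covers lowercase letters, digits, and 38 ASCII special characters.

```python
import string

def build_matrix(key):
    # classical 5x5 matrix over A-Z with J folded into I
    seen, cells = set(), []
    for ch in key.upper().replace("J", "I") + string.ascii_uppercase.replace("J", ""):
        if ch.isalpha() and ch not in seen:
            seen.add(ch)
            cells.append(ch)
    return [cells[i * 5:(i + 1) * 5] for i in range(5)]

def digraphs(text):
    # split plaintext into letter pairs, padding doubled letters and odd length with X
    letters = [c for c in text.upper().replace("J", "I") if c.isalpha()]
    pairs, i = [], 0
    while i < len(letters):
        a = letters[i]
        if i + 1 < len(letters) and letters[i + 1] != a:
            pairs.append((a, letters[i + 1]))
            i += 2
        else:
            pairs.append((a, "X"))
            i += 1
    return pairs

def encrypt_pair(m, a, b):
    pos = {m[r][c]: (r, c) for r in range(5) for c in range(5)}
    (ra, ca), (rb, cb) = pos[a], pos[b]
    if ra == rb:                                   # same row: letter to the right
        return m[ra][(ca + 1) % 5] + m[rb][(cb + 1) % 5]
    if ca == cb:                                   # same column: letter below
        return m[(ra + 1) % 5][ca] + m[(rb + 1) % 5][cb]
    return m[ra][cb] + m[rb][ca]                   # rectangle: swap corners

def transpose(text, perm):
    # second stratum: permute characters block-wise with a permutation key
    k = len(perm)
    text += "X" * (-len(text) % k)
    return "".join("".join(text[i:i + k][p] for p in perm)
                   for i in range(0, len(text), k))
```

For example, with the matrix built from "MONARCHY", the plaintext "HELLO" substitutes to "CFSUPM", which the permutation stratum then scrambles block by block, so frequency statistics of the digraph substitution alone no longer survive in the final ciphertext.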
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such a…
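To illustrate the entropy discretization mentioned above, the sketch below shows the basic single-split step: choose the cut point on a numeric attribute that minimizes the weighted class entropy of the induced partition. This is a generic textbook formulation, not the paper's summarization-based algorithm, and the data are illustrative.

```python
from math import log2
from collections import Counter

def entropy(labels):
    # Shannon entropy of a list of class labels
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def best_cut(values, labels):
    # candidate cuts are midpoints between consecutive distinct sorted values
    pairs = sorted(zip(values, labels))
    best, best_score = None, float("inf")
    for i in range(1, len(pairs)):
        if pairs[i - 1][0] == pairs[i][0]:
            continue
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [l for v, l in pairs if v <= cut]
        right = [l for v, l in pairs if v > cut]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        if score < best_score:
            best, best_score = cut, score
    return best
```

On a toy attribute [1, 2, 3, 10, 11, 12] with classes [0, 0, 0, 1, 1, 1], the chosen cut is 6.5, which separates the classes perfectly (weighted entropy 0).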
Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional data mining and machine learning algorithms do not scale well with data size. Mining and learning from big data need time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple data sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining an…
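A toy sketch of the multi-resolution idea described above: aggregate a stream of numeric readings into histograms at several bin widths, updated incrementally in one pass, where coarser resolutions trade accuracy for memory. The class name, bin widths, and data are illustrative assumptions, not the paper's structure.

```python
from collections import defaultdict

class MultiResolutionSummary:
    def __init__(self, widths=(1.0, 5.0, 25.0)):
        # one histogram per resolution; wider bins = coarser, smaller summary
        self.widths = widths
        self.counts = {w: defaultdict(int) for w in widths}

    def add(self, x):
        # a single pass over the data updates every resolution at once
        for w in self.widths:
            self.counts[w][int(x // w)] += 1

    def histogram(self, w):
        return dict(self.counts[w])
```

A downstream algorithm can then query the coarsest histogram that still meets its accuracy needs, instead of rescanning the raw data.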
The paper aims to solve the problem of choosing the most appropriate project from several service projects for the Iraqi Martyrs Foundation, or of ranking them by preference within the targeted criteria. This is done using a Multi-Criteria Decision-Making (MCDM) method, namely Multi-Objective Optimization by Ratio Analysis (MOORA), to measure the composite performance score that each alternative attains and the maximum benefit accruing to the beneficiary, according to criteria and weights calculated with the Analytic Hierarchy Process (AHP). The most important finding of the research, relying on expert opinion, is to choose the second project as the best alternative and to make an arrangement acco…
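The MOORA composite score described above can be sketched as follows: vector-normalize each criterion column, apply the (e.g. AHP-derived) weights, then sum the benefit criteria and subtract the cost criteria for each alternative. The decision matrix and weights below are illustrative, not the paper's data.

```python
from math import sqrt

def moora_scores(matrix, weights, benefit):
    # matrix: rows = alternatives, columns = criteria
    # benefit[j] is True for a benefit criterion, False for a cost criterion
    n_crit = len(weights)
    norms = [sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    scores = []
    for row in matrix:
        s = 0.0
        for j in range(n_crit):
            term = weights[j] * row[j] / norms[j]
            s += term if benefit[j] else -term
        scores.append(s)
    return scores
```

The alternative with the highest score is preferred; ranking all alternatives by score gives the preference order that the composite assessment produces.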
The History of Multi Parties and its Effect on Political System in India
Researchers dream of developing autonomous humanoid robots that behave and walk like a human being. Biped robots, although complex, have the greatest potential for use in human-centred environments such as the home or office. Studying biped robots is also important for understanding human locomotion and improving control strategies for prosthetic and orthotic limbs. The control systems of humans walking in cluttered environments are complex, however, and may involve multiple local controllers and commands from the cerebellum. Although biped robots have been of interest over the last four decades, no unified stability/balance criterion has been adopted for stabilization of the miscellaneous walking/running modes of biped…
The main goal of this work is to study the land cover changes of Baghdad city over a period of 30 years using multi-temporal Landsat satellite images (TM, ETM+, and OLI) acquired in 1984, 2000, and 2015, respectively. In this work, the principal components analysis (PCA) transform has been utilized as a multi-purpose operator (i.e., enhancer, compressor, and temporal change detector), since most of the information in the image bands is concentrated in the first PC image. The PC1 image for each of the three years is then partitioned into variable-sized blocks using a quad-tree technique. Several different classification methods have been used to classify the Landsat satellite images, including the proposed singular value decomposition (SVD) method implemented in Visual Basic sof…
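A small numpy sketch of PCA used as a temporal change detector, as described above: stack co-registered pixel values from the different dates as variables and project onto the principal components; PC1 carries the signal common to all dates, while later PCs highlight change. The synthetic data below are illustrative, not Landsat imagery.

```python
import numpy as np

def pca_components(stack):
    # stack: (n_pixels, n_dates) matrix of co-registered pixel values;
    # returns projections onto the PCs and eigenvalues, both ordered
    # by decreasing eigenvalue of the covariance matrix
    centered = stack - stack.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]
    return centered @ vecs[:, order], vals[order]
```

Because PC1 absorbs the variance shared across acquisition dates, it also acts as a compressor: downstream steps (such as the quad-tree partitioning) can operate on the single PC1 image instead of all input bands.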
Face blurring is an important and complex process in the advanced computer vision field. Face blurring generally involves two main steps: the first step detects the faces that appear in the frames, while the second step tracks the detected faces based on the information extracted during the detection step. In the proposed method, an image is captured by the camera in real time, and the Viola-Jones algorithm is then used to detect multiple faces in the captured image. To reduce the time consumed in handling the entire captured image, the image background is removed and only the motion areas are processe…
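The background-removal step described above can be sketched with simple frame differencing: pixels whose intensity changed between frames are flagged as motion, and only their bounding region would be passed on to a detector such as OpenCV's Viola-Jones `CascadeClassifier`. The threshold and frames below are illustrative assumptions.

```python
import numpy as np

def motion_mask(prev, curr, thresh=25):
    # pixels whose intensity changed by more than `thresh` count as motion
    diff = np.abs(curr.astype(int) - prev.astype(int))
    return diff > thresh

def motion_roi(mask):
    # bounding box of all motion pixels as (row0, row1, col0, col1), or None
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return int(ys.min()), int(ys.max()) + 1, int(xs.min()), int(xs.max()) + 1
```

Running the face detector only inside the returned region of interest, rather than over the whole frame, is what saves the processing time the method targets.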