The Internet provides vital communication between millions of individuals and is increasingly used as a tool of commerce; security is therefore of great importance for protecting communications and vital information. Cryptographic algorithms are essential to this security. Brute-force attacks are the major attacks on the Data Encryption Standard (DES), which is the main reason an improved DES structure is needed. This paper proposes a new, improved DES structure that makes the cipher more secure and resistant to attacks. The improved structure is built on standard DES with a new two-key generation scheme: the key generation system produces two keys, one simple and one encrypted with an improved Caesar algorithm. The encryption algorithm uses the simple key (key 1) in the first 8 rounds and the encrypted key (key 2) in rounds 9 through 16. With the improved structure, the results of this paper show increased DES encryption security, performance, and key-search complexity compared with standard DES, so differential cryptanalysis cannot be performed on the ciphertext.
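As a rough illustration of the two-key idea described above (not the paper's exact construction), the sketch below derives a second key from the first with a simple Caesar-style byte shift and selects key 1 for rounds 1–8 and key 2 for rounds 9–16; the shift value, helper names, and the example key are assumptions.

```python
# Minimal sketch: derive a second round key by a Caesar-style byte shift of the
# first key, then select key 1 for rounds 1-8 and key 2 for rounds 9-16.

def caesar_shift_key(key: bytes, shift: int = 7) -> bytes:
    """Hypothetical 'improved Caesar' step: shift every key byte by a fixed amount."""
    return bytes((b + shift) % 256 for b in key)

def round_key(key1: bytes, key2: bytes, round_no: int) -> bytes:
    """Select the key used in a given round (1-based), as the abstract describes."""
    return key1 if round_no <= 8 else key2

key1 = bytes.fromhex("133457799bbcdff1")   # example 64-bit key, not from the paper
key2 = caesar_shift_key(key1)

for r in range(1, 17):
    print(r, round_key(key1, key2, r).hex())
```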
Surface electromyography (sEMG) and accelerometer (Acc) signals play crucial roles in controlling prosthetic and upper-limb orthotic devices, as well as in assessing electrical muscle activity for various biomedical engineering and rehabilitation applications. In this study, an advanced discrimination system is proposed for identifying seven distinct shoulder girdle motions, aimed at improving prosthesis control. Features are extracted using Time-Dependent Power Spectrum Descriptors (TDPSD) to enhance motion recognition. The Spectral Regression (SR) method is then used to reduce the dimensionality of the extracted features. A comparative analysis is conducted between the Linear Discriminant Analysis (LDA) classifier …
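A minimal, illustrative pipeline for this kind of classification task is sketched below; it uses simple spectral-moment features as a stand-in for TDPSD and PCA as a generic stand-in for Spectral Regression, followed by an LDA classifier on synthetic data. None of the parameters or data come from the study.

```python
# Illustrative pipeline only: synthetic sEMG windows, spectral-moment features
# standing in for TDPSD, PCA as a generic stand-in for Spectral Regression, and
# an LDA classifier. The class count (7 motions) follows the abstract.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X_raw = rng.standard_normal((700, 256))          # 700 signal windows x 256 samples
y = np.repeat(np.arange(7), 100)                 # 7 shoulder-girdle motion labels

def spectral_moments(window):
    spec = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.arange(spec.size)
    m0 = spec.sum()
    return np.array([m0, (freqs * spec).sum() / m0, (freqs**2 * spec).sum() / m0])

X = np.array([spectral_moments(w) for w in X_raw])
clf = make_pipeline(PCA(n_components=2), LinearDiscriminantAnalysis())
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```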
This study presents the determination of the paleostress magnitudes and orientations of the Bekhme Structure in the Shaqlawa area, northeastern Iraq. Paleostress analysis of fault-slip measurements is performed using the right-dihedral, Lisle diagram, and Mohr circle methods. Based on Mohr circles, Bott's law, and the vertical thickness, the paleostress magnitudes at the time of tectonic activity were determined. First, the GEOrient software was used to estimate the orientations of the paleostresses (σ1, σ2, and σ3). Second, using the rupture-friction law and taking into account the depth of the overburden, the vertical stress (σv) was calculated to determine the paleostress magnitudes (σ1 = 4500 bars, σ2 = 1…
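A back-of-the-envelope sketch of the vertical-stress step is shown below: σv is estimated as ρgh from an assumed overburden density and depth, the usual input before applying a rupture-friction criterion. The numbers are illustrative, not values from the study.

```python
# Vertical stress from overburden depth: sigma_v = rho * g * h.
rho = 2500.0       # overburden density, kg/m^3 (assumed)
g = 9.81           # gravitational acceleration, m/s^2
depth = 3000.0     # overburden depth, m (assumed)

sigma_v_pa = rho * g * depth          # vertical stress in pascals
sigma_v_bar = sigma_v_pa / 1e5        # 1 bar = 1e5 Pa
print(f"sigma_v ≈ {sigma_v_bar:.0f} bars")
```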
The performance quality and search speed of the Block Matching (BM) algorithm are affected by the shapes and sizes of the search patterns used. In this paper, Kite Cross Hexagonal Search (KCHS) is proposed. This algorithm uses different search patterns (kite, cross, and hexagonal) to search for the best Motion Vector (MV). In the first step, KCHS uses a cross search pattern. In the second step, it uses one of the kite search patterns (up, down, left, or right, depending on the first step). In subsequent steps, it uses large/small Hexagonal Search (HS) patterns. The new algorithm is compared with several known fast block matching algorithms; comparisons are based on the number of search points and the Peak Signal-to-Noise Ratio (PSNR). According to the results …
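To make the search-pattern idea concrete, the sketch below evaluates the Sum of Absolute Differences (SAD) at a cross-shaped set of candidate offsets around a current best position; the full KCHS method adds kite and hexagonal patterns, which are not reproduced here, and all sizes and data are assumed.

```python
# One cross-pattern step of a block-matching search using SAD as the match cost.
import numpy as np

def sad(block, ref, top, left):
    h, w = block.shape
    return np.abs(block - ref[top:top+h, left:left+w]).sum()

def cross_search_step(block, ref, center, step=2):
    cy, cx = center
    candidates = [(cy, cx), (cy-step, cx), (cy+step, cx), (cy, cx-step), (cy, cx+step)]
    h, w = block.shape
    valid = [(y, x) for y, x in candidates
             if 0 <= y <= ref.shape[0]-h and 0 <= x <= ref.shape[1]-w]
    return min(valid, key=lambda p: sad(block, ref, *p))   # best candidate position

rng = np.random.default_rng(1)
ref = rng.integers(0, 256, (64, 64)).astype(np.int32)      # reference frame
block = ref[20:36, 22:38].copy()                            # 16x16 block to match
print("best offset:", cross_search_step(block, ref, center=(18, 20)))
```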
Digital image manipulation has become increasingly prevalent due to the widespread availability of sophisticated image editing tools. In copy-move forgery, a portion of an image is copied and pasted into another area within the same image. The proposed methodology begins by extracting features from the image with the Local Binary Pattern (LBP) algorithm. Two main statistical functions, Standard Deviation (STD) and Angular Second Moment (ASM), are computed for each LBP feature, capturing additional statistical information about the local textures. Next, a multi-level LBP feature selection is applied to select the most relevant features. This process involves performing the LBP computation at multiple scales or levels, capturing textures at different …
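A rough sketch of this feature-extraction step, with assumed LBP parameters and block size, might look as follows: an LBP map is computed and, for each block, the standard deviation of the LBP values and an ASM-style energy of the normalized LBP histogram are recorded.

```python
# LBP map plus two per-block statistics: standard deviation (STD) and an
# angular-second-moment-style (ASM) energy of the normalized LBP histogram.
# P, R, and block size are assumed, not taken from the paper.
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_block_features(gray, P=8, R=1.0, block=32):
    lbp = local_binary_pattern(gray, P, R, method="uniform")
    feats = []
    for top in range(0, gray.shape[0] - block + 1, block):
        for left in range(0, gray.shape[1] - block + 1, block):
            patch = lbp[top:top+block, left:left+block]
            hist, _ = np.histogram(patch, bins=P + 2, range=(0, P + 2), density=True)
            feats.append((patch.std(), np.sum(hist ** 2)))   # (STD, ASM)
    return np.array(feats)

rng = np.random.default_rng(2)
image = rng.integers(0, 256, (128, 128)).astype(np.uint8)    # placeholder image
print(lbp_block_features(image).shape)   # one (STD, ASM) pair per 32x32 block
```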
String matching is one of the essential problems in computer science, and a variety of computer applications provide string matching services to their end users. The remarkable growth in the amount of data created and stored by modern computing devices motivates researchers to develop ever more powerful methods for this problem. In this research, the Quick Search string matching algorithm is adapted to run in a multi-core environment using OpenMP directives, which can be employed to reduce the overall execution time of the program. English text, protein, and DNA data types are used to examine the effect of parallelizing the Quick Search string matching algorithm on multi-core …
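For illustration, a Python sketch of the Quick Search (Sunday) bad-character rule is given below; the paper's implementation is in C with OpenMP, so the chunked ranges here only indicate the independent units of work that a parallel for-loop would distribute across cores. The data and chunk size are invented.

```python
# Quick Search: shift the window by a bad-character rule keyed on the character
# just past the current window. Chunks overlap by m-1 so no match is lost.

def quick_search(text: str, pattern: str, start: int = 0, end: int = None):
    end = len(text) if end is None else end
    m = len(pattern)
    shift = {c: m - i for i, c in enumerate(pattern)}   # rightmost occurrence wins
    hits, i = [], start
    while i + m <= end:
        if text[i:i+m] == pattern:
            hits.append(i)
        nxt = text[i+m] if i + m < len(text) else None
        i += shift.get(nxt, m + 1)
    return hits

text = "ACGTACGTGACGACGT" * 4
pattern = "GACG"
chunk = 20
# Each (start, end) range is independent work for one core.
ranges = [(s, min(s + chunk + len(pattern) - 1, len(text)))
          for s in range(0, len(text), chunk)]
matches = sorted({i for s, e in ranges for i in quick_search(text, pattern, s, e)})
print(matches)
```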
Identification of complex communities in biological networks is a critical and ongoing challenge, since many network-related problems reduce to the subgraph isomorphism problem, which is known in the literature to be NP-hard. Several optimization algorithms have been applied to solve this problem. The main challenge in applying optimization algorithms, especially to large-scale complex networks, is their relatively long execution time. Thus, this paper proposes a parallel extension of the PSO algorithm to detect communities in complex biological networks. The main contribution of this study is threefold. First, a modified PSO algorithm with a local search operator is proposed …
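The generic PSO update that such an extension parallelizes is sketched below on a toy continuous objective; in the actual method a particle position would encode a node-to-community assignment and the fitness would be a community-quality measure such as modularity. All constants here are placeholders.

```python
# Canonical PSO skeleton: velocity and position updates, personal/global bests.
# The per-particle fitness evaluations are the natural step to parallelize.
import numpy as np

def pso(fitness, dim=10, particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
        x = x + v                                                   # position update
        f = np.array([fitness(p) for p in x])     # fitness evaluations (parallelizable)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

best, best_f = pso(lambda p: np.sum(p ** 2))      # toy sphere objective
print(best_f)
```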
The rapid and enormous growth of the Internet of Things, along with its widespread adoption, has resulted in the production of massive quantities of data that must be processed and sent to the cloud. The delay in processing these data and the time it takes to send them to the cloud have led to the emergence of fog computing, a new generation of the cloud in which the fog serves as an extension of cloud services at the edge of the network, reducing latency and traffic. Distributing computational resources to minimize makespan and running costs is one of the challenges of fog computing. This paper provides a new approach for improving task scheduling in a Cloud-Fog environment.
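As a toy illustration of the scheduling objective (not the paper's algorithm), the sketch below assigns each task greedily to the fog or cloud node that would finish it earliest and reports the resulting makespan and running cost; task lengths, node speeds, and cost rates are invented.

```python
# Greedy earliest-finish scheduling of tasks onto fog/cloud nodes, reporting
# makespan (latest node finish time) and total running cost.
tasks = [400, 250, 900, 120, 600, 300]        # task lengths (e.g., MI), assumed
nodes = [                                     # (name, speed in MIPS, cost per second)
    ("fog-1", 500, 0.01),
    ("fog-2", 400, 0.01),
    ("cloud", 2000, 0.05),
]

finish = {name: 0.0 for name, _, _ in nodes}
cost = 0.0
for length in sorted(tasks, reverse=True):    # schedule longest tasks first
    name, speed, rate = min(nodes, key=lambda n: finish[n[0]] + length / n[1])
    runtime = length / speed
    finish[name] += runtime
    cost += runtime * rate

print("makespan:", max(finish.values()), "cost:", round(cost, 3))
```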
Sustainable plant protection and the economics of plant crops worldwide depend heavily on the health of agriculture. In the modern world, one of the main factors influencing economic growth is the quality of agricultural produce. The need for future crop protection and production is growing, as disease-affected plants have caused considerable agricultural losses in several crop categories. Crop yields must be increased while preserving food quality and security and keeping the negative environmental impact as small as possible. To overcome these obstacles, early detection of affected plants is critical. Advances in intelligent systems and computer science effectively help to find more efficient and low-cost solutions. This …