The Internet provides vital communication between millions of individuals and is increasingly used as a commerce tool; security is therefore of high importance for protecting communications and vital information. Cryptographic algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard (DES), and this is the main reason an improved DES structure is needed. This paper proposes a new, improved structure for DES to make it secure and immune to attacks. The improved structure is built on standard DES with a new two-key generation scheme: the key generation system produces two keys, one plain and one encrypted using an improved Caesar algorithm. The encryption algorithm uses the plain key 1 in the first eight rounds and the encrypted key 2 from round 9 to round 16. Using this improved structure, the results of this paper show increased DES encryption security, performance, and key-search complexity compared with standard DES, meaning that differential cryptanalysis cannot be performed on the ciphertext.
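The two-key idea described in the abstract can be sketched as follows. The abstract does not specify the "improved Caesar" transform, so the position-dependent shift in `caesar_improved` below (and the 8-byte example key) is purely an illustrative assumption, not the authors' construction:

```python
def caesar_improved(key_bytes, shift=7):
    # Illustrative stand-in for the paper's "improved Caesar" cipher:
    # each byte is shifted by a base amount plus its position.
    return bytes((b + shift + i) % 256 for i, b in enumerate(key_bytes))

def select_round_key(round_no, key1, key2):
    # Rounds 1-8 use the plain key1; rounds 9-16 use the encrypted key2.
    return key1 if round_no <= 8 else key2

key1 = b"8bytekey"                      # hypothetical DES-sized key
key2 = caesar_improved(key1)            # second key, Caesar-encrypted
schedule = [select_round_key(r, key1, key2) for r in range(1, 17)]
```

An attacker who recovers the subkeys of one half of the rounds still faces a second, independently derived key in the other half, which is what increases the key-search complexity relative to single-key DES.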
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on algorithms that use the network most efficiently. It is also important to consider security, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into the compression algorithm, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p
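One common way to embed secrecy into an entropy coder, consistent with the resemblance the abstract points to, is to let a key-seeded generator decide the bit labelling of the code tree. The abstract does not describe the authors' actual scheme; `keyed_huffman` below is only a minimal sketch of the general idea:

```python
import heapq
import random

def keyed_huffman(freqs, secret_key):
    # Huffman coding in which a key-seeded RNG chooses tie-breaking and
    # 0/1 branch labels, so the codebook depends on the secret key.
    rng = random.Random(secret_key)
    # heap entries: (frequency, random tiebreak, tuple of symbols)
    heap = [(f, rng.random(), (sym,)) for sym, f in freqs.items()]
    heapq.heapify(heap)
    codes = {sym: "" for sym in freqs}
    while len(heap) > 1:
        f1, _, g1 = heapq.heappop(heap)
        f2, _, g2 = heapq.heappop(heap)
        b1, b2 = ("0", "1") if rng.random() < 0.5 else ("1", "0")
        for s in g1:
            codes[s] = b1 + codes[s]
        for s in g2:
            codes[s] = b2 + codes[s]
        heapq.heappush(heap, (f1 + f2, rng.random(), g1 + g2))
    return codes

codes = keyed_huffman({"a": 5, "b": 2, "c": 1}, "secret")
```

The compressed output remains optimal-length Huffman code, but decoding it requires reproducing the key-dependent codebook, which is where the encryption aspect enters.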
Given the importance of ecology and its entry into various fields in general, and the urban environment in particular, ecological cities have found a wide range of applications at multiple regional and global levels. However, it is repeatedly noted that there is a state of cognitive confusion and overlap around the term ecology, arising from the diversity of its implementation across several disciplines. Architects, designers, and planners have instilled biological development directly into the formal principles as well as the social structures of ecological cities. Therefore, the research presents a rapid review of the most relevant areas that have dealt with ecological cities, through research and analysis at various levels, from the concept and definition of
The Hbl toxin is a three-component haemolytic complex produced by Bacillus cereus sensu lato strains and implicated as a cause of diarrhoea in B. cereus food poisoning. While the structure of the HblB component of this toxin is known, the structures of the other components are unresolved. Here, we describe the expression of the recombinant HblL1 component and the elucidation of its structure to 1.36 Å. Like HblB, it is a member of the alpha-helical pore-forming toxin family. In comparison to other members of this group, it has an extended hydrophobic beta tongue region that may be involved in pore formation. Molecular docking was used to predict possible interactions between HblL1 and HblB, and suggests a head to tail dimer might f
Steganography is an important class of security techniques that is widely used in computer and network security today. In this research, a new algorithm is proposed with a new concept: treating steganography as an algorithmic secret-key technique, similar to a stream-cipher cryptographic system. The proposed algorithm is a secret-key system suggested for use in communications for message-transmission steganography
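The stream-cipher analogy above can be illustrated with a keyed least-significant-bit scheme, where a secret-key-seeded stream selects the embedding positions. The paper's actual algorithm is not given in the abstract; `embed`/`extract` below are only a hedged sketch of the general idea:

```python
import random

def embed(cover, message_bits, key):
    # Key-seeded stream picks which cover samples carry each message bit,
    # analogous to a stream cipher's keystream selecting transformations.
    rng = random.Random(key)
    positions = rng.sample(range(len(cover)), len(message_bits))
    stego = list(cover)
    for pos, bit in zip(positions, message_bits):
        stego[pos] = (stego[pos] & ~1) | bit   # overwrite the LSB
    return stego

def extract(stego, n_bits, key):
    # Regenerating the same keyed stream recovers the positions, and so
    # the message; without the key the positions are unknown.
    rng = random.Random(key)
    positions = rng.sample(range(len(stego)), n_bits)
    return [stego[p] & 1 for p in positions]

cover = list(range(64))                 # toy cover signal (e.g. pixel bytes)
bits = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed(cover, bits, "shared-key")
```

Only a receiver holding the same secret key can regenerate the position stream, which is what makes the scheme key-dependent rather than security-by-obscurity.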
The undetected error probability is an important measure for assessing the communication reliability provided by any error coding scheme. Two error coding schemes, namely Joint crosstalk avoidance and Triple Error Correction (JTEC) and JTEC with Simultaneous Quadruple Error Detection (JTEC-SQED), provide both crosstalk reduction and multi-bit error correction/detection. The available undetected error probability model yields only an upper-bound value and so does not give an accurate estimate of the reliability provided. This paper presents an improved mathematical model for estimating the undetected error probability of these two joint coding schemes. According to the decoding algorithm, the errors are classified into patterns and their decoding
In aspect-based sentiment analysis (ABSA), implicit aspect extraction is a fine-grained task aimed at extracting the hidden aspects in the contextual meaning of online reviews. Previous methods have shown that handcrafted rules interpolated into a neural network architecture are a promising approach to this task. In this work, we reduce the need for crafted rules, which must otherwise be laboriously re-articulated for each new training domain or text collection, by proposing a new architecture based on multi-label neural learning. The key idea is to capture the semantic regularities of the explicit and implicit aspects using word-embedding vectors and to interpolate these as a front layer in a Bidirectional Long Short-Term Memory (Bi-LSTM) network. First, we
Decision-making in Operations Research is the central concern of various studies with real-life applications. However, one drawback is that some of these studies are restricted and do not address the nature of values given in terms of imprecise data (ID). This paper therefore makes two contributions: first, decreasing the total costs by classifying subsets of costs; second, improving the optimal solution via the Hungarian assignment approach. The newly proposed method is called the fuzzy sub-Triangular form (FS-TF) under ID. The results obtained are excellent compared with previous methods, including the robust ranking technique, arithmetic operations, the magnitude ranking method, and the centroid ranking method. This
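A fuzzy assignment problem of the kind described above is typically solved by first ranking (defuzzifying) the imprecise costs and then solving the resulting crisp assignment problem. The sketch below uses the standard centroid ranking of a triangular fuzzy number and, for a small instance, brute-force enumeration in place of the full Hungarian algorithm; the paper's FS-TF ranking itself is not reproduced here and the helper names are ours:

```python
from itertools import permutations

def defuzzify(tri):
    # Centroid ranking of a triangular fuzzy number (a, b, c) -- a common
    # defuzzification, used here as a stand-in for the paper's FS-TF ranking.
    a, b, c = tri
    return (a + b + c) / 3.0

def fuzzy_assignment(cost):
    # Rank each fuzzy cost, then solve the crisp assignment problem.
    # Brute force over permutations; the Hungarian algorithm gives the
    # same optimum in O(n^3) for larger instances.
    crisp = [[defuzzify(t) for t in row] for row in cost]
    n = len(crisp)
    best = min(permutations(range(n)),
               key=lambda p: sum(crisp[i][p[i]] for i in range(n)))
    return best, sum(crisp[i][best[i]] for i in range(n))

# 2x2 example with triangular fuzzy costs (a, b, c)
cost = [[(1, 2, 3), (4, 5, 6)],
        [(4, 5, 6), (1, 2, 3)]]
best, total = fuzzy_assignment(cost)
```

Here worker 0 takes job 0 and worker 1 takes job 1, since the diagonal fuzzy costs rank lower than the off-diagonal ones.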