The Internet provides vital communication between millions of individuals. It is also increasingly used as a tool of commerce, so security is of high importance for securing communications and protecting vital information. Cryptographic algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard (DES), which is the main reason an improved structure for the DES algorithm is needed. This paper proposes a new, improved structure for DES to make it secure and immune to attacks. The improved structure was accomplished using standard DES with a new way of generating two keys: the key generation system produces two keys, one simple and the other encrypted using an improved Caesar algorithm. The encryption algorithm uses the simple key 1 in the first 8 rounds and the encrypted key 2 from round 9 to round 16. Using the improved structure of the DES algorithm, the results of this paper show increased encryption security, performance, and search complexity compared with standard DES, meaning that differential cryptanalysis cannot be performed on the ciphertext.
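As a rough illustration of the two-key round scheduling described above, the sketch below uses a toy Feistel core; `improved_caesar`, `derive_subkeys`, and `feistel_round` are hypothetical placeholders, not the paper's actual DES round function or key schedule:

```python
# Sketch of the two-key round scheduling idea: rounds 1-8 use subkeys from
# key 1, rounds 9-16 use subkeys from a Caesar-shifted key 2. The round
# function and key derivation are toy stand-ins, not real DES components.

def improved_caesar(key: bytes, shift: int = 7) -> bytes:
    """Hypothetical stand-in for the paper's improved Caesar step."""
    return bytes((b + shift) % 256 for b in key)

def derive_subkeys(key: bytes, rounds: int) -> list[bytes]:
    """Placeholder subkey derivation (real DES uses PC-1/PC-2 and shifts)."""
    return [bytes((b ^ r) & 0xFF for b in key) for r in range(rounds)]

def feistel_round(left: int, right: int, subkey: bytes) -> tuple[int, int]:
    """Toy Feistel round; a real DES round uses expansion, S-boxes and a P-box."""
    f_out = right ^ int.from_bytes(subkey[:4], "big")
    return right, left ^ f_out

def encrypt_block(block: int, key1: bytes, key2: bytes) -> int:
    left, right = block >> 32, block & 0xFFFFFFFF
    subkeys1 = derive_subkeys(key1, 8)                    # rounds 1-8: simple key 1
    subkeys2 = derive_subkeys(improved_caesar(key2), 8)   # rounds 9-16: encrypted key 2
    for sk in subkeys1 + subkeys2:
        left, right = feistel_round(left, right, sk)
    return (right << 32) | left                           # final swap, as in DES

print(hex(encrypt_block(0x0123456789ABCDEF, b"key1key1", b"key2key2")))
```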
This paper presents a robust algorithm for the assessment of risk priority for medical equipment based on the calculation of static and dynamic risk factors and Kohonen Self-Organizing Maps (SOM). Four risk parameters have been calculated for 345 medical devices in two general hospitals in Baghdad. Static risk factor components (equipment function and physical risk) and dynamic risk components (maintenance requirements and risk points) have been calculated. These risk components are used as input to an unsupervised Kohonen SOM. The accuracy of the network was found to be 98% for the proposed system. We conclude that the proposed model gives a fast and accurate assessment of risk priority and it works as p…
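A minimal hand-rolled Kohonen SOM sketch of how four risk components per device might be clustered into priority groups; the random data, network size, and training settings below are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

# Toy 1-D Kohonen SOM grouping devices by four risk components:
# [equipment function, physical risk, maintenance requirement, risk points].
rng = np.random.default_rng(0)
devices = rng.uniform(0.0, 1.0, size=(345, 4))   # 345 devices, 4 risk inputs

n_nodes, n_features = 5, 4                       # 5 priority classes (assumed)
weights = rng.uniform(0.0, 1.0, size=(n_nodes, n_features))

def train(data, weights, epochs=50, lr=0.5, radius=1.0):
    for epoch in range(epochs):
        eta = lr * (1 - epoch / epochs)          # decaying learning rate
        for x in data:
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))   # best match
            for j in range(len(weights)):
                h = np.exp(-((j - bmu) ** 2) / (2 * radius ** 2))  # neighborhood
                weights[j] += eta * h * (x - weights[j])
    return weights

weights = train(devices, weights)
priority = [int(np.argmin(np.linalg.norm(weights - x, axis=1))) for x in devices]
print(priority[:10])   # cluster (risk-priority) label for the first 10 devices
```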
Abstract. This work presents a detailed design of a three-jointed tendon-driven robot finger with a cam/pulleys transmission and a joint Variable Stiffness Actuator (VSA). The finger motion configuration is obtained by deriving the cam/pulleys transmission profile as a mathematical solution that is then implemented to achieve contact force isotropy on the phalanges. A VSA arrangement is proposed in which three VSAs are designed to act as muscles in joint space to provide firm grasping. As a mechatronic approach, a suitable type and number of force sensors and actuators are designed to sense the touch, actuate the finger, and tune the VSAs. The torque of the VSAs is controlled utilizing a designed Multi-Input Multi-Output (MIMO) fuzzy controller…
In this paper, the researcher suggests using a genetic algorithm to estimate the parameters of the Wiener degradation process, which is used to estimate the reliability of high-efficiency products, because of the difficulty of estimating their reliability with traditional techniques that depend only on product failure times. Monte Carlo simulation has been applied to demonstrate the efficiency of the proposed method in estimating the parameters, and it was compared with the maximum likelihood estimation method. The results showed that the genetic algorithm method is the best according to the AMSE comparison criterion, then the reliability…
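A rough sketch of how a genetic algorithm could search for the drift and diffusion parameters of a Wiener degradation process from its Gaussian increments; the simulated data, population size, and mutation scale are assumptions for illustration only, not the paper's implementation:

```python
import numpy as np

# A Wiener degradation process X(t) = mu*t + sigma*B(t) has independent
# increments ~ N(mu*dt, sigma^2*dt). A simple GA minimizes the negative
# log-likelihood of simulated increments over (mu, sigma).
rng = np.random.default_rng(1)
dt, true_mu, true_sigma = 1.0, 0.8, 0.3
increments = rng.normal(true_mu * dt, true_sigma * np.sqrt(dt), size=200)

def neg_log_likelihood(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    var = sigma ** 2 * dt
    return 0.5 * np.sum(np.log(2 * np.pi * var) + (increments - mu * dt) ** 2 / var)

def genetic_search(pop_size=60, generations=100):
    pop = rng.uniform([0.0, 0.01], [3.0, 2.0], size=(pop_size, 2))
    for _ in range(generations):
        fitness = np.array([neg_log_likelihood(p) for p in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]       # selection
        children = parents + rng.normal(0, 0.05, parents.shape)   # mutation
        pop = np.vstack([parents, children])
    return pop[np.argmin([neg_log_likelihood(p) for p in pop])]

mu_hat, sigma_hat = genetic_search()
print(f"GA estimates: mu={mu_hat:.3f}, sigma={sigma_hat:.3f}")
```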
Cryptography algorithms play a critical role in information technology against the various attacks witnessed in the digital era. Many studies and algorithms have been developed to address security issues for information systems. Traditional cryptography algorithms are characterized by the high complexity of their computational operations, whereas lightweight algorithms are the way to solve most of the security issues that arise when applying traditional cryptography in constrained devices; symmetric ciphers in particular are widely applied to ensure the security of data communication in such devices. In this study, we propose a hybrid algorithm based on two cryptography algorithms, PRESENT and Salsa20. Also, a 2D logistic map of a chaotic system is a…
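As a loose illustration of combining a chaotic map with a keystream cipher, the sketch below derives bytes from two weakly coupled logistic maps and XORs them with a message; the map form, parameters, and seeds are assumptions, not the paper's 2D logistic map or its PRESENT/Salsa20 construction:

```python
# Derive a chaotic keystream from a toy 2-D map (two weakly coupled logistic
# maps) and XOR it with plaintext. Purely illustrative of the general idea.

def coupled_logistic(x, y, r=3.99, eps=0.05):
    """One step of two weakly coupled logistic maps; state stays inside (0, 1)."""
    fx, fy = r * x * (1 - x), r * y * (1 - y)
    return (1 - eps) * fx + eps * fy, (1 - eps) * fy + eps * fx

def chaotic_keystream(length, x=0.27, y=0.53):
    for _ in range(100):                          # discard transient iterations
        x, y = coupled_logistic(x, y)
    stream = bytearray()
    for _ in range(length):
        x, y = coupled_logistic(x, y)
        stream.append(int((x + y) * 1e6) % 256)   # quantize state to a byte
    return bytes(stream)

plaintext = b"constrained-device message"
keystream = chaotic_keystream(len(plaintext))
ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))
print(ciphertext.hex())
```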
In this research, the focus was placed on estimating the parameters of the Hypoexponential distribution function using the maximum likelihood method and a genetic algorithm. More than one criterion, including MSE, was adopted for comparison using the simulation method.
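For context, a minimal maximum-likelihood fit of a two-phase hypoexponential distribution (the sum of two independent exponentials with distinct rates) might look like the sketch below; the simulated sample, starting values, and optimizer choice are illustrative assumptions, not the study's procedure:

```python
import numpy as np
from scipy.optimize import minimize

# Two-phase hypoexponential pdf: f(t) = l1*l2/(l2-l1) * (exp(-l1*t) - exp(-l2*t)),
# fitted here by numerically minimizing the negative log-likelihood.
rng = np.random.default_rng(2)
sample = rng.exponential(1 / 1.5, 500) + rng.exponential(1 / 0.6, 500)

def neg_log_lik(params):
    l1, l2 = params
    if l1 <= 0 or l2 <= 0 or abs(l1 - l2) < 1e-9:
        return np.inf
    pdf = (l1 * l2 / (l2 - l1)) * (np.exp(-l1 * sample) - np.exp(-l2 * sample))
    if np.any(pdf <= 0):
        return np.inf
    return -np.sum(np.log(pdf))

result = minimize(neg_log_lik, x0=[1.0, 0.5], method="Nelder-Mead")
print("MLE rate estimates:", result.x)
```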
Low-pressure sprinklers have been widely used to replace high-pressure impact sprinklers in lateral move sprinkler irrigation systems due to their low operating cost and high efficiency. However, runoff losses under a low-pressure sprinkler irrigation machine can be significant. This study aims to evaluate the performance of the variable pulsed irrigation algorithm (VPIA) in reducing runoff losses under a low-pressure lateral move sprinkler irrigation machine for three different soil types. The VPIA uses the ON-OFF pulsing technique to reduce runoff losses by controlling the number and width of the pulses, considering the soil and the irrigation machine properties. Also…
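A simplified sketch of the general ON-OFF pulsing idea: split the required depth into pulses short enough that ponded water never exceeds surface storage. The infiltration model and the numbers used are assumptions for illustration, not the VPIA itself:

```python
# Plan ON-OFF pulses so that the depth ponded during each ON period never
# exceeds the available surface storage (i.e., no runoff). Toy model only.

def plan_pulses(required_depth_mm, sprinkler_rate_mm_h,
                infiltration_rate_mm_h, surface_storage_mm):
    """Return (number_of_pulses, pulse_width_minutes) that avoids runoff."""
    excess_rate = sprinkler_rate_mm_h - infiltration_rate_mm_h
    if excess_rate <= 0:                       # soil keeps up; one long pulse
        return 1, 60 * required_depth_mm / sprinkler_rate_mm_h
    max_on_h = surface_storage_mm / excess_rate          # longest safe ON time
    depth_per_pulse = sprinkler_rate_mm_h * max_on_h
    n_pulses = int(-(-required_depth_mm // depth_per_pulse))   # ceiling division
    return n_pulses, 60 * (required_depth_mm / n_pulses) / sprinkler_rate_mm_h

print(plan_pulses(required_depth_mm=20, sprinkler_rate_mm_h=60,
                  infiltration_rate_mm_h=15, surface_storage_mm=5))
```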
The research aims to clarify the negative effects of pollution on the environment and the increase in the dangerous polluting materials discharged from these factories, and to put effective procedures in place to limit environmental pollution.
The research problem stems from the assumption that the researched factory suffers from a lack of application of the international specification (ISO 14004). The research problem is summarized in the following questions:
- What is the level of the organization in thinking about an environmental system according to ISO 14004?
- What are the requirements used in the researched factory…
In data transmission, a change in a single bit of the received data may lead to a misunderstanding or a disaster. Each bit of the transmitted information has high priority, especially information such as the address of the receiver. Detecting every single-bit change is therefore a key issue in the field of data transmission.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over noisy media. Those methods were: the 2D-Checksum method…
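For reference, the textbook single-parity and two-dimensional (row/column) parity schemes mentioned above can be sketched as follows; this is the standard construction, not the two novel methods proposed in the paper:

```python
# Single even parity per 8-bit word plus a column-parity word over the block.

def parity_bit(word: int, width: int = 8) -> int:
    """Even parity of a data word."""
    return bin(word & ((1 << width) - 1)).count("1") % 2

def two_d_parity(block):
    """Row parities plus a column-parity word for a list of 8-bit words."""
    rows = [parity_bit(w) for w in block]
    cols = 0
    for w in block:
        cols ^= w                       # XOR accumulates column parity bits
    return rows, cols

data = [0b10110010, 0b01101100, 0b11100001]
rows, cols = two_d_parity(data)

corrupted = data.copy()
corrupted[1] ^= 0b00010000              # flip a single bit in word 1
new_rows, new_cols = two_d_parity(corrupted)
print("row parity changed:", new_rows != rows)       # detects the odd flip
print("column parity changed:", new_cols != cols)
```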
Machine learning offers a significant advantage for many difficult problems in the oil and gas industry, especially when it comes to resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. Clarifications of the workflow methodology are presented alongside comprehensive models in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they present are obsolete and do not account for real or rigid conditions in solving the permeability computation. To…
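A generic sketch of log-based permeability prediction with a tree ensemble; the synthetic well-log features and target below are random placeholders, not Bazirgan field data or the study's actual model:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Predict (log-)permeability from well-log features. The features standing in
# for GR, RHOB, NPHI, RT and the target are synthetic placeholders.
rng = np.random.default_rng(3)
logs = rng.normal(size=(1000, 4))                       # "GR, RHOB, NPHI, RT"
log_perm = 0.8 * logs[:, 2] - 0.5 * logs[:, 0] + rng.normal(0, 0.2, 1000)

X_train, X_test, y_train, y_test = train_test_split(
    logs, log_perm, test_size=0.25, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out samples:",
      round(r2_score(y_test, model.predict(X_test)), 3))
```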