Cryptography is the process of transforming messages to prevent unauthorized access to data. One of the main problems, and an important concern, in secret-key cryptography is the key itself: for a high level of secure communication, the key plays a central role, and both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weakened by its key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; encrypting the key enhances the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to serve the purpose of information security by adding a new level of security to the Triple Data Encryption Standard algorithm using the Nth Degree Truncated Polynomial Ring Unit algorithm. This aim is achieved by adding two new key functions, Enckey() and Deckey(), for encrypting and decrypting the Triple Data Encryption Standard key, making the algorithm stronger. The obtained results also show good resistance against brute-force attack, which makes the system more effective, by applying the Nth Degree Truncated Polynomial Ring Unit algorithm to encrypt and decrypt the key of the Triple Data Encryption Standard. These modifications also enhance the degree of complexity, increase the key search space, and make the ciphered message difficult for an attacker to crack.
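The arithmetic underlying the Nth Degree Truncated Polynomial Ring Unit (NTRU) algorithm is multiplication in the truncated polynomial ring Z_q[x]/(x^N − 1), i.e., cyclic convolution of coefficient vectors modulo q. A minimal sketch of that core operation follows; the function name `ring_mul` and the small parameters in the example are illustrative, not taken from the paper:

```python
def ring_mul(a, b, N, q):
    """Multiply two polynomials (coefficient lists of length N) in the
    truncated ring Z_q[x]/(x^N - 1): a cyclic convolution modulo q."""
    c = [0] * N
    for i in range(N):
        for j in range(N):
            # Exponents wrap around because x^N is identified with 1.
            c[(i + j) % N] = (c[(i + j) % N] + a[i] * b[j]) % q
    return c
```

For example, with N = 3 and q = 7, multiplying (1 + x) by x wraps no terms and yields x + x², i.e., the coefficient list [0, 1, 1].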
In light of the globalization that surrounds the business environment, and whose impact is reflected on industrial economic units, the whole world has become a single market whose variables affect every unit, each of which is in turn affected in proportion to its economic contribution. The problem addressed by this research is that Pareto analysis enables industrial economic units to diagnose the risks surrounding them; the main objective of the research was therefore to classify risks into internal and external types and to identify those risks that require more attention.
The research was based on the hypothesis that, when Pareto analysis is used, risks can be identified and addressed before they occur.
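The Pareto step described above can be sketched as ranking risks by impact and flagging the "vital few" that account for a cumulative threshold (classically 80%) of total impact. The function name, the threshold default, and the example risk scores below are illustrative assumptions, not data from the research:

```python
def pareto_classify(risks, threshold=80.0):
    """Rank risks by impact score and return the 'vital few' whose
    cumulative share of total impact first reaches `threshold` percent."""
    ranked = sorted(risks.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(v for _, v in ranked)
    vital, cum = [], 0.0
    for name, impact in ranked:
        cum += 100.0 * impact / total
        vital.append(name)
        if cum >= threshold:
            break
    return vital
```

A unit would then concentrate mitigation effort on the returned risks, since by the Pareto principle they drive most of the total exposure.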
The accurate localization of the basic components of the human face (i.e., eyebrows, eyes, nose, mouth, etc.) in images is an important step in face-processing techniques such as face tracking, facial expression recognition, and face recognition. It is, however, a challenging task due to variations in scale, orientation, pose, facial expression, partial occlusion, and lighting conditions. In the current paper, a scheme comprising three hierarchical stages for facial-component extraction is presented; it works regardless of illumination variance. Adaptive linear contrast enhancement methods such as gamma correction and contrast stretching are used to simulate the variance in lighting conditions among images. As testing material
The assignment model is a mathematical model that expresses a real problem facing factories and companies: making the appropriate decision about the best allocation of machines, jobs, or workers to machines, so as to raise efficiency or profit to the highest possible level, or to reduce cost or time as far as possible. In this research, the labeling method was used to solve a fuzzy assignment problem on real data approved by the Diwaniya tire factory; the data covered two factors, efficiency and cost, and the problem was solved manually through a number of iterations until the optimal solution was reached,
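For a small crisp cost matrix, the optimal one-to-one assignment the model seeks can be found by exhaustive search, which makes the objective concrete even though the paper uses the labeling method on fuzzy data. This brute-force sketch (function name and example matrix are illustrative) is only feasible for small n:

```python
from itertools import permutations

def solve_assignment(cost):
    """Find the minimum-total-cost one-to-one assignment of columns
    (jobs) to rows (machines) of a small square cost matrix by
    checking every permutation."""
    n = len(cost)
    best_cost, best_perm = float("inf"), None
    for perm in permutations(range(n)):
        c = sum(cost[i][perm[i]] for i in range(n))
        if c < best_cost:
            best_cost, best_perm = c, perm
    return best_cost, best_perm
```

Dedicated methods (labeling, Hungarian) reach the same optimum in polynomial time, which matters once the matrix grows beyond a handful of machines.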
Semi-parametric regression models have been studied in a variety of applications and scientific fields owing to their high flexibility in dealing with problematic data: the parametric part is easy to interpret, while the non-parametric part retains flexibility. The response variable or the explanatory variables can contain outliers, and the OLS approach is sensitive to outliers. To address this issue, robust (resistant) methods, which are less sensitive to the presence of outlying values in the data, were used. This study aims to estimate the partial regression model using a robust estimation method with the wavel
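The robustness idea can be illustrated with the simplest M-estimation setting, a Huber estimate of location computed by iteratively reweighted least squares: observations with large residuals are down-weighted instead of dominating the fit as they do under OLS. This sketch is a generic illustration of robust estimation, not the specific estimator of the study; the tuning constant k = 1.345 is the conventional Huber default:

```python
def huber_location(x, k=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimate of location via iteratively reweighted least
    squares. Points within k of the current estimate get weight 1;
    outliers get the reduced weight k/|residual|."""
    mu = sum(x) / len(x)  # start from the (non-robust) mean
    for _ in range(max_iter):
        w = [1.0 if abs(xi - mu) <= k else k / abs(xi - mu) for xi in x]
        new_mu = sum(wi * xi for wi, xi in zip(w, x)) / sum(w)
        if abs(new_mu - mu) < tol:
            break
        mu = new_mu
    return mu
```

On clean data the estimate coincides with the mean; with a gross outlier it stays near the bulk of the data, which is exactly the resistance property the abstract invokes.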
Buildings such as malls, offices, airports, and hospitals have nowadays become very complicated, which increases the need for a solution that helps people find their location inside them. GPS or cell signals are commonly used for positioning in outdoor environments but are not accurate indoors. The growing presence of smartphones in daily life, together with the existing infrastructure of Wi-Fi access points commonly available in most buildings, motivated this work to build a hybrid mechanism that combines the access points' fingerprints with smartphone barometer sensor readings to accurately determine the user's position on a building floor relative to well-known lan
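The two ingredients of such a hybrid can be sketched independently: nearest-neighbor matching of an RSSI scan against a stored radio map, and floor-level altitude from the barometer via the standard barometric formula. Function names, the distance metric, and the example radio map are illustrative assumptions, not the paper's design:

```python
def nearest_fingerprint(scan, radio_map):
    """Match a live RSSI scan {AP: dBm} to the stored location whose
    fingerprint is closest in Euclidean distance over shared APs."""
    def dist(fp):
        common = set(scan) & set(fp)
        if not common:
            return float("inf")
        return sum((scan[ap] - fp[ap]) ** 2 for ap in common) ** 0.5
    return min(radio_map, key=lambda loc: dist(radio_map[loc]))

def pressure_altitude(p_hpa, p0_hpa=1013.25):
    """Altitude in meters from barometric pressure (international
    barometric formula); differences between readings separate floors."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1 / 5.255))
```

In practice the barometer reading narrows the search to one floor's fingerprints before the Wi-Fi match is run, which is the kind of combination the abstract describes.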
This paper presents an approach to electrocardiography (ECG) signal processing based on linear and nonlinear adaptive filtering using the Recursive Least Squares (RLS) algorithm to remove two kinds of noise that affect the ECG signal: high-frequency noise (HFN) and low-frequency noise (LFN). Simulation is performed in Matlab. The ECG, HFN, and LFN signals used in this study were downloaded from ftp://ftp.ieee.org/uploads/press/rangayyan/, and the filtering was carried out using an adaptive finite impulse response (FIR) filter, which gave better results than infinite impulse response (IIR) filters did.
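The textbook RLS recursion that adapts the FIR weights can be sketched in a few lines; this is a generic illustration (the parameter names, the forgetting factor λ = 0.99, and the initialization δ = 100 are conventional defaults, not the paper's settings):

```python
def rls_filter(d, x, M=4, lam=0.99, delta=100.0):
    """Recursive Least Squares adaptive FIR filter.
    d: desired signal, x: reference input, M: number of taps.
    Returns the error signal e = d - y (the cleaned output when x
    is correlated with the noise in d)."""
    w = [0.0] * M                                   # tap weights
    P = [[delta if i == j else 0.0 for j in range(M)] for i in range(M)]
    e_out = []
    for n in range(len(d)):
        u = [x[n - k] if n - k >= 0 else 0.0 for k in range(M)]
        Pu = [sum(P[i][j] * u[j] for j in range(M)) for i in range(M)]
        denom = lam + sum(u[i] * Pu[i] for i in range(M))
        k = [Pu[i] / denom for i in range(M)]       # Kalman gain vector
        e = d[n] - sum(w[i] * u[i] for i in range(M))
        w = [w[i] + k[i] * e for i in range(M)]     # weight update
        uP = [sum(u[i] * P[i][j] for i in range(M)) for j in range(M)]
        P = [[(P[i][j] - k[i] * uP[j]) / lam for j in range(M)]
             for i in range(M)]                     # inverse-correlation update
        e_out.append(e)
    return e_out
```

When d is an unknown FIR filtering of x, the error converges rapidly toward zero as the weights identify the filter, which is the property exploited for noise cancellation.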
This study investigated the treatment of dairy wastewater by electrocoagulation with iron filings as electrodes. The study dealt with real samples collected from a local dairy-products factory in Baghdad. Response Surface Methodology (RSM) was used to optimize five experimental variables, at six levels each, for estimating the chemical oxygen demand (COD) removal efficiency. These variables were the distance between electrodes, detention time, dosage of NaCl as electrolyte, initial COD concentration, and current density. RSM investigated the direct and interaction effects between the parameters to estimate the optimum values. The respective optimum value was 1 cm for the distance between electrodes, (6
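The response optimized by the RSM design is the standard COD removal efficiency, computed from influent and effluent concentrations; a one-line helper (with illustrative values, not the study's measurements) makes the definition explicit:

```python
def cod_removal_efficiency(cod_in, cod_out):
    """Percent removal of chemical oxygen demand:
    100 * (influent - effluent) / influent, both in mg/L."""
    return 100.0 * (cod_in - cod_out) / cod_in
```

For example, treating wastewater at 1000 mg/L COD down to 150 mg/L corresponds to 85% removal.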
Rotor dynamics generally deals with the vibration of rotating structures. When designing high-speed rotors, it is essential to take the rotor-dynamic characteristics into account. This paper describes the modeling of the rotor and of bearing-support flexibility by bringing these rotor-dynamic features into a standard Finite Element Approach (FEA) model. Transient and harmonic analysis procedures were carried out in ANSYS, and an approach to the critical-speed calculation is presented. The paper shows how the BEAM188 and COMBI214 elements are used to represent the shaft and bearings; the dynamic stiffness and damping coefficients of the journal bearings, as matrices, have been found
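For orientation, the simplest critical-speed estimate comes from the undamped single-mass (Jeffcott) rotor, where the first critical speed equals the natural frequency ω_n = √(k/m); the full FEA model refines this with distributed mass and the bearing coefficient matrices. This helper is a textbook sketch, not the paper's procedure:

```python
import math

def critical_speed_rpm(k, m):
    """First critical speed of an undamped single-mass (Jeffcott)
    rotor: omega_n = sqrt(k/m) in rad/s, converted to rev/min.
    k: effective shaft/bearing stiffness (N/m), m: disc mass (kg)."""
    return math.sqrt(k / m) * 60.0 / (2.0 * math.pi)
```

A stiffness of (2π)² N/m with a 1 kg mass gives ω_n = 2π rad/s, i.e., a critical speed of 60 rpm, which is a convenient sanity check on the unit conversion.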
It is well known that understanding human facial expressions, a key component in understanding emotions with broad applications in the field of human-computer interaction (HCI), has been a long-standing issue. In this paper, we shed light on the use of a deep convolutional neural network (DCNN) for facial emotion recognition from videos using the TensorFlow machine-learning library from Google. The work was applied to ten emotions from the Amsterdam Dynamic Facial Expression Set - Bath Intensity Variations (ADFES-BIV) dataset and tested using two datasets.
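Whatever the network architecture, a ten-class recognizer ends in a softmax over the output logits followed by an argmax; that final step can be sketched without TensorFlow. The emotion labels in the example are hypothetical placeholders, not the ADFES-BIV label set:

```python
import math

def softmax(logits):
    """Numerically stable softmax: shift by the max before exponentiating."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def predict_emotion(logits, labels):
    """Return the label with the highest softmax probability and that
    probability. len(logits) must equal len(labels)."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=lambda i: probs[i])
    return labels[best], probs[best]
```

In a ten-emotion setup the DCNN would emit ten logits per frame and this step turns them into a predicted emotion with a confidence score.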