The effect of time (that is, of corrosion product formation) on the corrosion rate of carbon steel pipe in aerated 0.1 N NaCl solution under turbulent flow conditions is investigated. Tests are conducted using the electrochemical polarization technique, determining the limiting current density of oxygen reduction over a Reynolds number range of 15,000 to 110,000 and a temperature range of 30 to 60 °C. The effect of corrosion product formation on the friction factor is also studied and discussed. The corrosion process is analyzed as a mass transfer operation, and mass transfer theory is employed to express the corrosion rate. The results are compared with several proposed models, particularly those based on the analogy among momentum, heat, and mass transport, and the ability of these models to predict corrosion rates in the presence of corrosion products is examined and discussed. It is found that the formation of corrosion products with time decreases the corrosion rate (or mass transfer rate) at low Reynolds number and temperature, while it increases the corrosion rate at high Reynolds number and temperature. Corrosion product formation also enhances momentum transport, to an extent that depends on temperature, Reynolds number, and corrosion rate. The increased roughness it creates raises the friction factor while the measured corrosion rate falls, so the analogy correlations overestimate the corrosion rate.
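As a concrete illustration of the mass-transfer treatment described above, the sketch below converts a measured limiting current density of oxygen reduction into a mass transfer coefficient via Faraday's law and compares it with a smooth-pipe Chilton-Colburn analogy prediction. All numerical values (limiting current, O2 concentration, Schmidt number, velocity) are illustrative assumptions, not the paper's data.

    # Mass-transfer coefficient from the limiting current density of O2
    # reduction, compared with a Chilton-Colburn analogy estimate for a
    # smooth pipe. Property values are assumed for illustration only.
    F = 96485.0        # Faraday constant, C/mol
    n = 4              # electrons per O2 molecule (O2 + 2H2O + 4e- -> 4OH-)
    C_O2 = 0.25        # assumed bulk dissolved O2, mol/m^3 (~8 mg/L)
    i_lim = 12.0       # assumed measured limiting current density, A/m^2

    # Faraday's law: the limiting current gives the mass transfer coefficient
    k_meas = i_lim / (n * F * C_O2)              # m/s

    # Chilton-Colburn analogy, Sh = (f/2) Re Sc^(1/3), with the Blasius
    # friction factor for a smooth pipe; roughness from corrosion products
    # raises f, which is why such correlations overestimate the rate here.
    Re = 50_000        # within the tested 15,000-110,000 range
    Sc = 500           # assumed Schmidt number of O2 in water
    u = 1.5            # assumed mean velocity, m/s
    f = 0.079 * Re ** -0.25                      # Fanning friction factor
    k_analogy = (f / 2) * u * Sc ** (-2 / 3)     # m/s

    print(f"k from i_lim:   {k_meas:.3e} m/s")
    print(f"k from analogy: {k_analogy:.3e} m/s")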
Drones have attracted the attention of researchers because they now enter into many aspects of daily life. The Tri-copter was chosen because it combines the quadcopter's stability with quicker manoeuvrability. In this paper, the full nonlinear Tri-copter model is derived and three controllers are applied: Proportional-Integral-Derivative (PID), Fractional-Order PID (FOPID), and Nonlinear PID (NLPID). The controllers' parameters are tuned using the Grey Wolf Optimization (GWO) algorithm, and the resulting responses are compared. The improvement rate for the Tri-copter model under the nonlinear controller (NLPID), compared with …
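A minimal sketch of the tuning loop described above, assuming a toy second-order attitude model in place of the full Tri-copter dynamics: Grey Wolf Optimization searches for PID gains that minimize a time-weighted tracking error. The plant, the gain bounds, and the cost function are illustrative assumptions, not the paper's model.

    import numpy as np

    def step_cost(gains, T=5.0, dt=0.01):
        """ITAE cost of a unit step response for one toy attitude axis."""
        kp, ki, kd = gains
        theta = omega = integ = 0.0
        prev_err = 1.0
        cost = 0.0
        for k in range(int(T / dt)):
            err = 1.0 - theta
            integ += err * dt
            deriv = (err - prev_err) / dt
            u = kp * err + ki * integ + kd * deriv
            prev_err = err
            # assumed toy dynamics: theta'' = u - 0.5 * omega
            omega += (u - 0.5 * omega) * dt
            theta += omega * dt
            cost += (k * dt) * abs(err) * dt    # time-weighted |error|
        return cost

    def gwo(cost_fn, dim=3, wolves=10, iters=50, lo=0.0, hi=20.0):
        """Standard GWO: wolves move toward the three best (alpha/beta/delta)."""
        rng = np.random.default_rng(0)
        X = rng.uniform(lo, hi, (wolves, dim))
        for t in range(iters):
            fitness = np.array([cost_fn(x) for x in X])
            alpha, beta, delta = X[np.argsort(fitness)[:3]]
            a = 2.0 * (1 - t / iters)           # decreases linearly 2 -> 0
            for i in range(wolves):
                new = np.zeros(dim)
                for leader in (alpha, beta, delta):
                    A = a * (2 * rng.random(dim) - 1)
                    C = 2 * rng.random(dim)
                    new += leader - A * np.abs(C * leader - X[i])
                X[i] = np.clip(new / 3.0, lo, hi)
        return alpha

    print("tuned [kp, ki, kd]:", gwo(step_cost))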
The objective of this work is to combine human biometric characteristics with unique attributes of the computer in order to protect computer networks and resource environments through improved authentication and authorization techniques. On the biometric side, the available methods and algorithms were studied, and the fingerprint was concluded to be the best, although it has some flaws. The fingerprint algorithm was therefore improved so that it enhances the clarity of the ridge-and-valley structures of fingerprint images, taking into account the estimated orientation and frequency of the neighbouring ridges. On the computer side, a computer and its components, like a human, have unique …
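A minimal sketch of the core idea, under assumptions the abstract leaves open: a credential derived from both a fingerprint template and machine-unique attributes is valid only for that user on that computer. The attribute choice and the HMAC binding below are illustrative, not the paper's exact scheme.

    import hashlib, hmac, platform, uuid

    def machine_attributes() -> bytes:
        """Collect stable machine-specific identifiers (illustrative choice)."""
        mac = uuid.getnode()                       # network adapter MAC
        return f"{mac}|{platform.node()}|{platform.machine()}".encode()

    def bind_credential(fingerprint_template: bytes) -> str:
        """HMAC the biometric template with a key derived from the machine."""
        machine_key = hashlib.sha256(machine_attributes()).digest()
        return hmac.new(machine_key, fingerprint_template,
                        hashlib.sha256).hexdigest()

    # Usage: the same template yields a different credential on another
    # machine, so a stolen template alone is not enough to authenticate.
    template = b"minutiae-feature-vector"          # placeholder, not real data
    print(bind_credential(template))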
Crime is a threat to any nation's security administration and jurisdiction. Crime analysis therefore becomes increasingly important, because it assigns the time and place of incidents based on collected spatial and temporal data. Older techniques, such as paperwork, investigative judgment, and plain statistical analysis, are not efficient enough to predict accurately when and where a crime will take place, but when machine learning and data mining methods are deployed, crime analysis and prediction accuracy increase dramatically. In this study, various types of criminal analysis and prediction using several machine learning and data mining techniques, based on …
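A minimal sketch of such a pipeline, assuming synthetic incident records in place of real data: spatial-temporal features (location, hour, weekday) feed a standard classifier. The data generation and the random-forest choice are illustrative, not the study's method.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 2000
    # features per incident: [latitude, longitude, hour of day, weekday]
    X = np.column_stack([
        rng.uniform(33.2, 33.4, n),    # latitude (illustrative range)
        rng.uniform(44.3, 44.5, n),    # longitude (illustrative range)
        rng.integers(0, 24, n),        # hour of day
        rng.integers(0, 7, n),         # day of week
    ])
    # synthetic binary label loosely tied to the hour, with 10% noise
    y = (X[:, 2] > 18).astype(int) ^ (rng.random(n) < 0.1)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                              random_state=0)
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_tr, y_tr)
    print("accuracy:", accuracy_score(y_te, model.predict(X_te)))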
This study looks into the many methods used in the risk assessment procedures of today's construction industry. Because novel assessment methods are adopted slowly, professionals frequently resort to strategies that have previously been validated as successful. A precise analytical tool that uses the cost of risk as its measurement and draws on the knowledge of professionals could help bridge the gap between theory and practice. This step examines the relevant literature, sorts articles according to their publication year, and identifies domains and qualities. Consequently, the most significant findings have been presented in a manner …
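A minimal sketch of the cost-of-risk measurement such a tool could rest on, assuming a hypothetical expert-elicited register: each risk's exposure is its probability times its cost impact, and the exposures aggregate into a contingency figure.

    # Expected cost of risk: exposure = probability x cost impact.
    # The register entries below are illustrative assumptions.
    risks = [
        # (description,                probability, cost impact in $)
        ("design change",              0.30,        120_000),
        ("material price escalation",  0.50,         80_000),
        ("labour shortage",            0.20,        150_000),
    ]

    exposures = {name: p * cost for name, p, cost in risks}
    for name, ev in sorted(exposures.items(), key=lambda kv: -kv[1]):
        print(f"{name:28s} expected cost: ${ev:,.0f}")
    print(f"{'total contingency':28s} ${sum(exposures.values()):,.0f}")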
Computer-aided disease diagnosis has been extensively studied and applied in the diagnosis and monitoring of several chronic diseases. Early detection and risk assessment of breast diseases based on clinical data helps doctors make an early diagnosis and monitor disease progression. The purpose of this study is to exploit the Convolutional Neural Network (CNN) in discriminating breast MRI scans into pathological and healthy. This study presents a fully automated and efficient deep feature extraction algorithm that exploits the spatial information obtained from both T2W-TSE and STIR MRI sequences to discriminate between pathological and healthy breast MRI scans. The breast MRI scans are preprocessed prior to the feature …
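A minimal sketch of such a discriminator, with illustrative layer sizes: the two MRI sequences (T2W-TSE and STIR) enter as two input channels of a small CNN with a two-class (pathological vs. healthy) head. This is an assumed stand-in, not the study's architecture.

    import torch
    import torch.nn as nn

    class BreastMRICNN(nn.Module):
        def __init__(self):
            super().__init__()
            # two input channels: one per MRI sequence (T2W-TSE, STIR)
            self.features = nn.Sequential(
                nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(64, 2)   # pathological vs. healthy

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    # Usage: a batch of 4 two-channel 128x128 slices (random placeholders).
    scans = torch.randn(4, 2, 128, 128)
    print(BreastMRICNN()(scans).shape)           # torch.Size([4, 2])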
This paper proposes a new encryption method. It combines two cipher algorithms, DES and AES, to generate hybrid keys; this combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: the approach merges 64 bits of DES with 64 bits of AES to produce 128 bits as a root key for the remaining 15 keys, a complexity that raises the level of the ciphering process, and the shift operation moves only one bit to the right. The second is the nature of the encryption process itself, which includes two keys and mixes one round of DES with one round of AES to reduce the processing time. The W-method deals with …
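A minimal sketch of the key-schedule idea as described, under assumptions about details the abstract leaves open: 64 bits taken from a DES key and 64 bits from an AES key are concatenated into a 128-bit root key, and the 15 remaining keys are derived by rotating one bit to the right each time.

    import secrets

    def derive_round_keys(des_half: int, aes_half: int, rounds: int = 16):
        """des_half/aes_half are 64-bit ints; returns 16 128-bit keys."""
        mask = (1 << 128) - 1
        half = (1 << 64) - 1
        # 128-bit root key: DES-derived bits high, AES-derived bits low
        root = ((des_half & half) << 64) | (aes_half & half)
        keys = [root]
        k = root
        for _ in range(rounds - 1):
            # rotate the 128-bit key right by exactly one bit
            k = ((k >> 1) | ((k & 1) << 127)) & mask
            keys.append(k)
        return keys

    keys = derive_round_keys(secrets.randbits(64), secrets.randbits(64))
    print(f"root key: {keys[0]:032x}")
    print(f"key 16:   {keys[-1]:032x}")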
This paper proposes a new method for network self-fault management (NSFM) based on two technologies: intelligent agents, to automate fault management tasks, and Windows Management Instrumentation (WMI), to identify faults faster when resources are independent (different types of devices). The proposed network self-fault management reduces the load of network traffic by reducing the requests and responses between server and client, which achieves less downtime for each node when a fault occurs in the client. The performance of the proposed system is measured by three metrics: efficiency, availability, and reliability. A high average efficiency is obtained, depending on the faults that occur in the system, which reaches …
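A minimal sketch of the WMI side of such a client agent, assuming a hypothetical free-space check as the monitored fault: the agent polls local WMI classes and reports only state changes, which is what keeps the server/client request-response traffic low. Requires Windows and the third-party wmi package; this is not the paper's agent.

    import time
    import wmi    # pip install wmi (Windows only)

    def report_fault(device, message):
        """Placeholder: in the proposed system this would notify the server."""
        print(f"FAULT {device}: {message}")

    def poll_faults(interval=30):
        conn = wmi.WMI()
        last_state = {}
        while True:
            for disk in conn.Win32_LogicalDisk(DriveType=3):   # fixed disks
                low = (disk.FreeSpace is not None
                       and int(disk.FreeSpace) < 1 << 30)      # under 1 GiB
                # report only on state change, not on every poll
                if last_state.get(disk.DeviceID) != low:
                    last_state[disk.DeviceID] = low
                    if low:
                        report_fault(disk.DeviceID, "free space below 1 GiB")
            time.sleep(interval)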
The concept of the active contour model has been extensively utilized in the segmentation and analysis of images. The technique has been effectively employed to identify contours in object recognition, computer graphics and vision, and biomedical image processing, covering ordinary images as well as medical images such as Magnetic Resonance Images (MRI), X-rays, and ultrasound images. Kass, Witkin, and Terzopoulos introduced this energy-minimizing model, the "Active Contour Model" (also known as the Snake), in 1987. A snake is a curve defined within an image field that can be set in motion by external forces derived from the image data and internal forces from the curve itself. The present study …
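A minimal sketch of a snake in practice, using scikit-image's implementation of the Kass-Witkin-Terzopoulos model rather than the study's own code: an initial circular contour is deformed by internal (alpha, beta) and image-derived forces until it settles on edges. The test image and parameter values are illustrative.

    import numpy as np
    from skimage import data
    from skimage.filters import gaussian
    from skimage.segmentation import active_contour

    img = data.astronaut()[..., 0]          # any grayscale test image
    # initial contour: a circle of radius 100 around an assumed object
    s = np.linspace(0, 2 * np.pi, 400)
    init = np.column_stack([100 + 100 * np.sin(s),    # rows
                            220 + 100 * np.cos(s)])   # cols

    snake = active_contour(
        gaussian(img, 3, preserve_range=False),  # smooth image forces
        init,
        alpha=0.015,   # elasticity: internal force resisting stretching
        beta=10.0,     # rigidity: internal force resisting bending
        gamma=0.001,   # time step of the iterative minimization
    )
    print(snake.shape)   # (400, 2): converged contour coordinates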