This paper proposes a new method for network self-fault management (NSFM) based on two technologies: intelligent agents to automate fault management tasks, and Windows Management Instrumentation (WMI) to identify faults faster when resources are heterogeneous (different types of devices). The proposed network self-fault management reduces the network traffic load by cutting the number of requests and responses exchanged between the server and clients, which yields less downtime for each node when a fault occurs on the client. The performance of the proposed system is measured by three metrics: efficiency, availability, and reliability. Averaged over the faults that occurred in the system, efficiency reaches 92.19%, availability 92.375%, and reliability 100%. The proposed system managed five devices. The NSFM was implemented using the Java and C# languages.
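For illustration only (the paper's own implementation is in Java and C#, and its agent logic is not reproduced here), the following minimal Python sketch shows the kind of WMI query a monitoring agent might run to detect a local fault condition. It assumes the third-party `wmi` package on a Windows host and uses low free disk space as a stand-in fault.

```python
# Minimal sketch (not the paper's code): polling device status via WMI.
# Assumes the third-party "wmi" package (pip install wmi) on a Windows host.
import wmi

def check_disk_health(threshold=0.05):
    """Query Win32_LogicalDisk and flag fixed drives that are nearly full."""
    conn = wmi.WMI()  # connect to the local WMI provider
    faults = []
    for disk in conn.Win32_LogicalDisk(DriveType=3):  # 3 = local fixed disks
        size = int(disk.Size or 0)
        free = int(disk.FreeSpace or 0)
        if size and free / size < threshold:
            faults.append((disk.DeviceID, "low free space"))
    return faults

if __name__ == "__main__":
    for device, reason in check_disk_health():
        print(f"Fault detected on {device}: {reason}")
```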
Diyala Governorate was recently exposed to high flood waves discharged from Hemrin Dam. Since the dam was at its full capacity during the flood period, these waves were discharged into the Diyala River. Because the Diyala River's capacity had been reduced to 750 m³/s, the cities and villages on both banks of the river were inundated. Thus, the study's objective is to design a flood escape off the Diyala River to discharge the flood wave through it. The flood escape simulation was done using HEC-RAS software. Two hundred twenty-three cross-sections of the escape and 30 cross-sections of the Diyala River were used as geometric data. Depending on the geological formation that the escape passed through …
Computer systems and networks are increasingly used for many types of applications; as a result, the security threats to computers and networks have also increased significantly. Traditionally, password authentication is widely used to authenticate legitimate users, but this method has many loopholes such as password sharing, brute-force attacks, dictionary attacks, and more. The aim of this paper is to improve the password authentication method using Probabilistic Neural Networks (PNNs) with three types of distance, namely Euclidean Distance, Manhattan Distance, and Euclidean Squared Distance, and four keystroke-dynamics features: Dwell Time (DT), Flight Time (FT), the mixture of DT and FT, and finally Up-Up Time (UUT). The results …
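As a minimal sketch of the three distance measures named above (not the paper's PNN implementation), the snippet below applies them to hypothetical keystroke-timing feature vectors such as dwell times; in a PNN these distances would feed the pattern layer.

```python
# Sketch of the three distance measures named above, applied to
# hypothetical keystroke-dynamics feature vectors (e.g. dwell times in seconds).
import numpy as np

def euclidean(a, b):
    return float(np.sqrt(np.sum((a - b) ** 2)))

def manhattan(a, b):
    return float(np.sum(np.abs(a - b)))

def euclidean_squared(a, b):
    return float(np.sum((a - b) ** 2))

# Example: a stored template vector versus a login attempt (illustrative values).
template = np.array([0.11, 0.09, 0.13, 0.10])
attempt  = np.array([0.12, 0.08, 0.15, 0.10])
for name, fn in [("Euclidean", euclidean), ("Manhattan", manhattan),
                 ("Euclidean squared", euclidean_squared)]:
    print(name, fn(template, attempt))
```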
Biometrics represent the most practical method for swiftly and reliably verifying and identifying individuals based on their unique biological traits. This study addresses the increasing demand for dependable biometric identification systems by introducing an efficient approach to automatically recognize ear patterns using Convolutional Neural Networks (CNNs). Despite the widespread adoption of facial recognition technologies, the distinct features and consistency inherent in ear patterns provide a compelling alternative for biometric applications. Employing CNNs in our research automates the identification process, enhancing accuracy and adaptability across various ear shapes and orientations. The ear, being visible and easily captured in …
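The abstract does not specify the network architecture, so the following Keras sketch is only a generic illustration of a small CNN classifier for ear images; the input size, layer sizes, and number of subjects are assumptions.

```python
# Minimal sketch (not the paper's architecture): a small CNN classifier for
# grayscale ear images using Keras. Input shape and class count are assumed.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_ear_cnn(input_shape=(64, 64, 1), num_classes=10):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),  # one output per subject
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_ear_cnn()
model.summary()
```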
With the development of high-speed network technologies, there has been a recent rise in the transfer of significant amounts of sensitive data across the Internet and other open channels. The data will be encrypted with the same key under both the Triple Data Encryption Standard (TDES) and the Advanced Encryption Standard (AES), using the block cipher modes Cipher Block Chaining (CBC) and Electronic Codebook (ECB). Block ciphers are often used for secure data storage on fixed hard drives and portable devices, and for safe network data transport. Therefore, to assess the security of the encryption method, it is necessary to become familiar with and evaluate the algorithms of cryptographic systems. Block cipher users need to be sure that the ciphers the …
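To make the CBC/ECB distinction concrete, here is a short PyCryptodome sketch with AES; it is illustrative only, with throwaway key and IV values, and is not the evaluation setup used in the paper (TDES is likewise available in PyCryptodome as Crypto.Cipher.DES3).

```python
# Sketch of CBC vs ECB block-cipher modes with AES, using PyCryptodome
# (pip install pycryptodome); key/IV values here are illustrative only.
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

key = get_random_bytes(16)   # 128-bit AES key
iv = get_random_bytes(16)    # CBC needs an IV; ECB does not
plaintext = b"sensitive data sent over an open channel"

# CBC: each block is XORed with the previous ciphertext block before encryption.
cbc = AES.new(key, AES.MODE_CBC, iv)
ct_cbc = cbc.encrypt(pad(plaintext, AES.block_size))

# ECB: each block is encrypted independently (repeated blocks leak patterns).
ecb = AES.new(key, AES.MODE_ECB)
ct_ecb = ecb.encrypt(pad(plaintext, AES.block_size))

# Decrypt the CBC ciphertext to confirm round-trip correctness.
pt = unpad(AES.new(key, AES.MODE_CBC, iv).decrypt(ct_cbc), AES.block_size)
assert pt == plaintext
```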
Estimating the semantic similarity between short texts plays an increasingly prominent role in many fields related to text mining and natural language processing applications, especially with the large increase in the volume of textual data produced daily. Traditional approaches to calculating the degree of similarity between two texts, based on the words they share, do not perform well with short texts because two similar texts may be written in different terms by employing synonyms. As a result, short texts should be compared semantically. In this paper, a method for measuring semantic similarity between texts is presented which combines knowledge-based and corpus-based semantic information to build a semantic network that represents …
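As an illustration of the knowledge-based side of such a measure (not the paper's exact method), the sketch below computes a WordNet path-based word similarity with NLTK; in a combined approach, scores like this could be blended with corpus-based statistics such as cosine similarity over word vectors. It assumes NLTK with the WordNet corpus downloaded (nltk.download("wordnet")).

```python
# Illustrative sketch: knowledge-based word similarity from WordNet.
from nltk.corpus import wordnet as wn

def word_similarity(w1, w2):
    """Best path-based similarity over all synset pairs (0 if none found)."""
    scores = [s1.path_similarity(s2) or 0.0
              for s1 in wn.synsets(w1)
              for s2 in wn.synsets(w2)]
    return max(scores, default=0.0)

# Synonyms score high even though the surface words differ, which is exactly
# the failure case of word-overlap measures on short texts.
print(word_similarity("car", "automobile"))
print(word_similarity("car", "banana"))
```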
The study aims to measure life management (in its various approaches) among university academics and to identify the differences among life-management strategies (in their various approaches) with regard to gender (male, female), field of education (scientific, humanitarian), and title (assistant lecturer, lecturer, assistant professor, and professor). The study sample consisted of (246) male and female academics from four colleges (Arts, Science, Science Education, and Humanitarian Science Education). To achieve the goals of the study, the researchers used the Freund & Baltes (2002) life-management strategies scale after translating it and determining its reliability and validity parameters. The statistical analysis of the …
All domains in this world are built on sets of theories. These theories are responsible for interpreting some of the phenomena and interlocking relationships around us. Commonly, theories are categorized according to the aspect they emphasize. A group of theories has also been produced regarding the leadership domain. This paper presents some of these theories, including Great Man Theory, Trait Theory, Behavioral Theories, Contingency Theories, Situational Theory, Transactional Theories, and Transformational Theories, as these are considered the most popular and common in the field of leadership, and they are discussed in this work.
Solid waste is a major issue in today's world; it can be a contributing factor to pollution and the spread of vector-borne diseases. Because of its complicated nonlinear processes, this problem is difficult to model and optimize using traditional methods. In this study, a mathematical model was developed to optimize the cost of solid waste recycling and management. In the optimization phase, the salp swarm algorithm (SSA) is utilized to determine the level of discarded solid waste and reclaimed solid waste. SSA is an optimization technique and a recent method of finding the optimal solution to a mathematical relationship based on leaders and followers. It takes many random solutions, as well as their outward or inward fluctuations, t…
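To make the leader/follower mechanics concrete, the sketch below is a minimal SSA implementation run on a toy quadratic cost; the objective function and parameter values are placeholders, not the paper's waste-management model.

```python
# Minimal sketch of the salp swarm algorithm (SSA) on a toy cost function.
import numpy as np

def ssa_minimize(cost, lb, ub, n_salps=30, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(lb)
    pop = rng.uniform(lb, ub, size=(n_salps, dim))   # random initial salp chain
    best_pos = pop[np.argmin([cost(x) for x in pop])].copy()
    best_val = cost(best_pos)
    for t in range(1, n_iter + 1):
        c1 = 2 * np.exp(-(4 * t / n_iter) ** 2)      # balances exploration/exploitation
        for i in range(n_salps):
            if i == 0:  # leader moves around the food source (best solution so far)
                c2, c3 = rng.random(dim), rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                pop[i] = np.where(c3 >= 0.5, best_pos + step, best_pos - step)
            else:       # followers move toward the salp in front of them
                pop[i] = (pop[i] + pop[i - 1]) / 2
            pop[i] = np.clip(pop[i], lb, ub)
            val = cost(pop[i])
            if val < best_val:
                best_val, best_pos = val, pop[i].copy()
    return best_pos, best_val

# Toy usage: minimize a quadratic "cost" over two decision variables.
lb, ub = np.array([0.0, 0.0]), np.array([10.0, 10.0])
print(ssa_minimize(lambda x: (x[0] - 3) ** 2 + (x[1] - 7) ** 2, lb, ub))
```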