This paper proposes a new method for network self-fault management (NSFM) based on two technologies: an intelligent agent to automate fault management tasks, and Windows Management Instrumentation (WMI) to identify faults faster when resources are independent (different types of devices). The proposed NSFM reduces the network traffic load by reducing the request-response exchanges between the server and clients, which achieves less downtime for each node when a fault occurs in the client. The performance of the proposed system is measured by three measures: efficiency, availability, and reliability. Depending on the faults that occurred in the system, a high average efficiency of 92.19% is obtained, with availability of 92.375% and reliability of 100%. The proposed system managed five devices. The NSFM was implemented using the Java and C# languages.
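The abstract reports three evaluation measures but does not give their formulas. The following is a minimal Java sketch assuming the standard definitions (availability as uptime over total time, efficiency as the share of faults handled, reliability as the share of monitoring cycles completed without failure); the figures in `main` are hypothetical, not the paper's data.

```java
// Minimal sketch of the three measures, assuming standard definitions;
// the paper's exact formulas are not given in the abstract.
public class NsfmMetrics {

    // Availability of one managed node over an observation window (minutes).
    static double availability(double uptime, double downtime) {
        return uptime / (uptime + downtime);
    }

    // Efficiency: share of injected faults the agent detected and repaired.
    static double efficiency(int faultsHandled, int faultsInjected) {
        return (double) faultsHandled / faultsInjected;
    }

    // Reliability: share of monitoring cycles completed without agent failure.
    static double reliability(int cyclesOk, int cyclesTotal) {
        return (double) cyclesOk / cyclesTotal;
    }

    public static void main(String[] args) {
        // Hypothetical figures for one of the five managed devices.
        System.out.printf("availability = %.2f%%%n", 100 * availability(739.0, 61.0));
        System.out.printf("efficiency   = %.2f%%%n", 100 * efficiency(59, 64));
        System.out.printf("reliability  = %.2f%%%n", 100 * reliability(64, 64));
    }
}
```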
Intellectual capital is a critical issue in the success and excellence of organizations, especially when it is linked to customer experience management. Because of the characteristics of the Iraqi environment, which have affected all sectors including the banking sector, it was important to discuss the types of intellectual capital and their relationship to customer experience management, especially among the managers of private banks. The importance of the research lies in linking its variables to build the strategic capabilities of banks to achieve excellence and sustainability. It aims to diagnose the extent of interest in the types of intellectual capital and customer experience management in the Iraqi banking sector.
The research is based on two main variables, information and communication technology and the quality of blended (physical and electronic) education, and aims to reveal the relationship between four dimensions (physical devices, software, databases, and communication networks) and the elements of education represented by the teacher, the student, the teaching process, and the curriculum. The research, based on methodology and post-analysis, was conducted at the Technical College of Management / Baghdad by polling the opinions of a random sample that included (80) teachers out of (86) and (276) students representing a random sample from all departments of the college (for the morning study) out of (3500) students.
This study aimed to investigate the reality and degree of application of Management by Walking Around, namely (discovery of facts, improvement of communication, motivation, development, creativity, and feedback), and its role in achieving organizational justice (procedural justice, distributive justice, and interactional justice), as well as the availability of the core principles of this system from the point of view of senior management personnel in the Directorate of Anbar Education. The importance of the research lies in the philosophical rooting of the nature of the variables as modern administrative terminology. The research dealt with a problem expressed by a number of intellectual and appl
Insurance activity in the various countries of the world is an important indicator of the strength of a country's economy. This research studies the stages of settling fire insurance compensation, the importance of investing risk time in achieving speedy payment of compensation, and the implications for the insured's continuation of insuring with the insurance company, whether it operates in the public sector or the private sector. Hence the research problem lies in how quickly insurance companies can pay compensation so that the insured can return to carry out their work and make up for the loss incurred. A purposive sample was selected from the managers of branches and divisions and their assistants
This paper discusses an optimal path-planning algorithm based on an Adaptive Multi-Objective Particle Swarm Optimization (AMOPSO) algorithm for two case studies. In the first case, a single robot must reach a goal in a static environment that contains two obstacles and two danger sources. The second case improves the ability of five robots to reach the goal by the shortest way. The proposed algorithm solves the optimization problem for the first case by finding the minimum distance from the initial to the goal position while also ensuring that the generated path keeps a maximum distance from the danger zones. For the second case, it finds the shortest path for every robot, without any collision between them, in the shortest time. In ord
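As an illustration of the two objectives described above (minimize path length, maximize distance from the danger sources), the Java sketch below evaluates a candidate path. It assumes a path is a polyline of 2D waypoints and that danger sources are point hazards; the AMOPSO search itself (velocity and position updates, Pareto handling, adaptivity) is not shown.

```java
import java.awt.geom.Point2D;
import java.util.List;

// Sketch of the two cost terms for a candidate path, under the assumptions
// stated above. Only the objective evaluation is illustrated.
public class PathObjectives {

    // Objective 1 (minimize): total Euclidean length of the waypoint path.
    static double pathLength(List<Point2D> path) {
        double len = 0.0;
        for (int i = 1; i < path.size(); i++) {
            len += path.get(i - 1).distance(path.get(i));
        }
        return len;
    }

    // Objective 2 (maximize): smallest distance between any waypoint and any
    // danger source, i.e. the clearance of the path from the danger zones.
    static double clearance(List<Point2D> path, List<Point2D> dangerSources) {
        double min = Double.MAX_VALUE;
        for (Point2D p : path) {
            for (Point2D d : dangerSources) {
                min = Math.min(min, p.distance(d));
            }
        }
        return min;
    }

    static Point2D point(double x, double y) {
        return new Point2D.Double(x, y);
    }

    public static void main(String[] args) {
        // Hypothetical start, one intermediate waypoint, and goal.
        List<Point2D> path = List.of(point(0, 0), point(3, 4), point(8, 8));
        List<Point2D> danger = List.of(point(4, 3), point(6, 7));
        System.out.printf("length = %.2f, clearance = %.2f%n",
                pathLength(path), clearance(path, danger));
    }
}
```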
A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously for estimating the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike test value. The case study involves four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, which are located in northern Iraq. The model performance was
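The objective named above is the Akaike test value. Below is a minimal Java sketch of one common least-squares form, AIC = n ln(RSS / n) + 2k; the paper's exact formulation may differ, and the numbers in `main` are hypothetical.

```java
// Minimal sketch of the objective to be minimized, assuming the common
// least-squares form AIC = n * ln(RSS / n) + 2k.
public class AkaikeObjective {

    static double aic(double[] observed, double[] forecast, int numParameters) {
        int n = observed.length;
        double rss = 0.0; // residual sum of squares
        for (int i = 0; i < n; i++) {
            double e = observed[i] - forecast[i];
            rss += e * e;
        }
        return n * Math.log(rss / n) + 2.0 * numParameters;
    }

    public static void main(String[] args) {
        // Hypothetical monthly values for one variable at one site.
        double[] obs = {21.3, 24.1, 28.7, 31.0};
        double[] fit = {20.9, 24.6, 28.1, 31.5};
        System.out.printf("AIC = %.3f%n", aic(obs, fit, 6));
    }
}
```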
Starting from 4,4′-dimercaptobiphenyl, a variety of phenolic Schiff base (methylolic, etheric, epoxy) derivatives have been synthesized. All proposed structures were supported by FTIR, 1H-NMR, 13C-NMR, and elemental analysis; all analyses were performed at the Center of Consultation at Jordan University.
In this paper, an algorithm through which we can embed more data than regular spatial-domain methods is introduced. We compress the secret data using Huffman coding, and this compressed data is then embedded using a Laplacian sharpening method. We use Laplace filters to determine the effective hiding places; then, based on a threshold value, we find the places with the highest values acquired from these filters for embedding the watermark. In this work, our aim is to increase the capacity of the information to be embedded by using Huffman coding and, at the same time, to increase the security of the algorithm by hiding data in the places that have the highest edge values and are less noticeable. The perform
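The following Java sketch illustrates only the location-selection step described above: scoring pixels with a 3x3 Laplacian kernel and keeping the positions whose absolute response exceeds a threshold, strongest first. The kernel choice and threshold value are assumptions, and the Huffman compression and bit-embedding steps are not shown.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Sketch of Laplacian-based selection of candidate hiding places.
public class LaplacianSites {

    // A common 4-neighbour Laplacian kernel (an assumption, not the paper's).
    static final int[][] KERNEL = { {0, -1, 0}, {-1, 4, -1}, {0, -1, 0} };

    // Returns {row, col, |response|} triples for interior pixels whose
    // absolute Laplacian response is at least the threshold, strongest first.
    static List<int[]> candidateSites(int[][] gray, int threshold) {
        List<int[]> sites = new ArrayList<>();
        for (int y = 1; y < gray.length - 1; y++) {
            for (int x = 1; x < gray[0].length - 1; x++) {
                int response = 0;
                for (int ky = -1; ky <= 1; ky++)
                    for (int kx = -1; kx <= 1; kx++)
                        response += KERNEL[ky + 1][kx + 1] * gray[y + ky][x + kx];
                if (Math.abs(response) >= threshold) {
                    sites.add(new int[] {y, x, Math.abs(response)});
                }
            }
        }
        sites.sort(Comparator.comparingInt((int[] s) -> s[2]).reversed());
        return sites;
    }

    public static void main(String[] args) {
        // Tiny hypothetical grayscale image with a bright square in the middle.
        int[][] img = {
            {10, 10, 10, 10},
            {10, 200, 200, 10},
            {10, 200, 200, 10},
            {10, 10, 10, 10}
        };
        for (int[] s : candidateSites(img, 100)) {
            System.out.printf("(%d,%d) |lap|=%d%n", s[0], s[1], s[2]);
        }
    }
}
```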
Image pattern classification is considered a significant step for image and video processing. Although various image pattern algorithms that achieve adequate classification have been proposed so far, achieving higher accuracy while reducing the computation time remains challenging to date. A robust image pattern classification method is essential to obtain the desired accuracy. Such a method must accurately classify image blocks into plain, edge, and texture (PET) classes using an efficient feature extraction mechanism. Moreover, to date, most existing studies have focused on evaluating their methods based on specific orthogonal moments, which limits the understanding of their potential application to various Discrete Orthogonal Moments (DOMs). The
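As a purely illustrative Java sketch of plain/edge/texture (PET) block labelling, the snippet below classifies a block from a simple mean-gradient measure with assumed thresholds; it is not the feature extraction mechanism or the discrete orthogonal moments evaluated in the paper.

```java
// Illustrative PET labelling from local intensity variation only; the
// criterion and thresholds are assumptions for demonstration purposes.
public class PetLabel {

    enum Kind { PLAIN, EDGE, TEXTURE }

    static Kind classify(int[][] block, double plainThresh, double edgeThresh) {
        int h = block.length, w = block[0].length;
        double sumAbsDiff = 0.0;
        int count = 0;
        for (int y = 0; y < h; y++) {
            for (int x = 0; x + 1 < w; x++) {
                sumAbsDiff += Math.abs(block[y][x + 1] - block[y][x]);
                count++;
            }
        }
        double activity = sumAbsDiff / count; // mean horizontal gradient
        if (activity < plainThresh) return Kind.PLAIN;
        return activity > edgeThresh ? Kind.EDGE : Kind.TEXTURE;
    }

    public static void main(String[] args) {
        int[][] flat = { {50, 50}, {50, 50} };
        int[][] step = { {0, 255}, {0, 255} };
        System.out.println(classify(flat, 5.0, 80.0)); // PLAIN
        System.out.println(classify(step, 5.0, 80.0)); // EDGE
    }
}
```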