The traditional centralized approach to network management presents severe efficiency and scalability limitations in large-scale networks. Data collection and analysis typically involve transferring large volumes of management data to the manager, which consumes considerable network bandwidth and creates a bottleneck at the manager side. These problems are addressed here using agent technology to distribute the management functionality over the network elements. The proposed system consists of a server agent that works together with client agents to monitor when client computers log on or off and which user is working on each machine. A file system watcher mechanism is used to detect any change in files. The results are presented in real time, minimizing cost, which is an important factor in successful network management and was achieved using agents.
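As a minimal sketch of the file-change detection a client agent could perform (the polling approach, the function names, and reporting via print are assumptions; the paper's watcher component and agent messaging are not specified here):

```python
import os
import time

def snapshot(path):
    """Record the last-modification time of every file under `path`."""
    state = {}
    for root, _dirs, files in os.walk(path):
        for name in files:
            full = os.path.join(root, name)
            try:
                state[full] = os.stat(full).st_mtime
            except OSError:
                pass  # file disappeared between listing and stat
    return state

def watch_directory(path, interval=2.0):
    """Poll `path` and report created, deleted, and modified files."""
    previous = snapshot(path)
    while True:
        time.sleep(interval)
        current = snapshot(path)
        for f in current.keys() - previous.keys():
            print("created:", f)
        for f in previous.keys() - current.keys():
            print("deleted:", f)
        for f in current.keys() & previous.keys():
            if current[f] != previous[f]:
                print("modified:", f)
        previous = current

if __name__ == "__main__":
    # A client agent would forward these events to the server agent instead of printing.
    watch_directory(".")
```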
The Dagum Regression Model, introduced to address limitations in traditional econometric models, provides enhanced flexibility for analyzing data characterized by heavy tails and asymmetry, which is common in income and wealth distributions. This paper develops and applies the Dagum model, demonstrating its advantages over other distributions such as the Log-Normal and Gamma distributions. The model's parameters are estimated using Maximum Likelihood Estimation (MLE) and the Method of Moments (MoM). A simulation study evaluates both methods' performance across various sample sizes, showing that MoM tends to offer more robust and precise estimates, particularly in small samples. These findings provide valuable insights into the ana
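As a sketch of the maximum-likelihood side of the estimation (the (a, b, p) parameterization, the hand-coded log-density, and the synthetic sample below are illustrative assumptions, not the paper's implementation):

```python
import numpy as np
from scipy.optimize import minimize

def dagum_logpdf(x, a, b, p):
    """Log-density of the Dagum distribution with shapes a, p and scale b."""
    z = x / b
    return (np.log(a) + np.log(p) + (a * p - 1) * np.log(z)
            - (p + 1) * np.log1p(z ** a) - np.log(b))

def fit_dagum_mle(x):
    """Maximize the Dagum log-likelihood over (a, b, p) with a box-constrained optimizer."""
    def nll(theta):
        a, b, p = theta
        return -np.sum(dagum_logpdf(x, a, b, p))
    res = minimize(nll, x0=[1.0, np.median(x), 1.0],
                   bounds=[(1e-3, None)] * 3, method="L-BFGS-B")
    return res.x

# Illustrative use on synthetic heavy-tailed data drawn via the Dagum quantile function.
rng = np.random.default_rng(0)
a, b, p = 3.0, 2.0, 0.8
u = rng.uniform(size=5_000)
sample = b * (u ** (-1.0 / p) - 1.0) ** (-1.0 / a)
print(fit_dagum_mle(sample))
```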
In this paper, we show that for the alternating group A_n, the class C of n-cycles is such that CC covers A_n when n = 4k + 1 > 5 (so n is odd). This class splits into two classes of A_n, denoted C and C′, and it was found that CC = C′C′.
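A minimal restatement of the claim in standard notation, assuming "covers" means that the product of the class with itself is all of A_n:

```latex
% For n = 4k + 1 > 5 (so n is odd and an n-cycle is an even permutation),
% let C be the class of n-cycles, which splits in A_n into two classes C and C'.
C \cdot C \;=\; A_n, \qquad C \cdot C \;=\; C' \cdot C'.
```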
This paper presents a method to classify colored textural images of skin tissues. Since medical images are highly heterogeneous, developing a reliable skin-cancer detection process is difficult, and a mono-fractal dimension is not sufficient to classify images of this nature. Multifractal-based feature vectors are suggested here as an alternative and more effective tool, and multiple color channels are used to obtain more descriptive features. Two multifractal-based sets of features are suggested: the first set measures the local roughness property, while the second set measures the local contrast property. A combination of all the features extracted from the three color models gives the highest classification accuracy of 99.4%.
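As an illustration of how a multifractal descriptor can be computed per color channel, here is a minimal box-counting sketch of the generalized dimensions D_q; the box sizes, the q values, and the way channels are combined into a feature vector are assumptions, not the paper's exact roughness and contrast feature sets.

```python
import numpy as np

def generalized_dimensions(channel, qs=(0, 1, 2), box_sizes=(2, 4, 8, 16, 32)):
    """Estimate multifractal (generalized) dimensions D_q of one image channel.

    The channel is normalized to a probability measure, summed over s x s boxes,
    and D_q is the slope of the partition sum against log(box size).
    """
    img = channel.astype(float)
    img = img / img.sum()
    dims = {}
    for q in qs:
        log_size, partition = [], []
        for s in box_sizes:
            h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
            boxes = img[:h, :w].reshape(h // s, s, w // s, s).sum(axis=(1, 3))
            p = boxes[boxes > 0].ravel()
            if q == 1:                                  # information dimension (q -> 1 limit)
                value = np.sum(p * np.log(p))
            else:
                value = np.log(np.sum(p ** q)) / (q - 1)
            log_size.append(np.log(s))
            partition.append(value)
        dims[q] = np.polyfit(log_size, partition, 1)[0]  # slope = D_q
    return dims

# Illustrative use: D_q per RGB channel contributes to the texture feature vector.
rng = np.random.default_rng(1)
rgb = rng.random((128, 128, 3))
features = [generalized_dimensions(rgb[:, :, c])[q] for c in range(3) for q in (0, 1, 2)]
print(features)
```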
The research aims to identify the educational plan for private and public kindergartens. The researchers selected a sample consisting of (59) female teachers from private kindergartens and (150) female teachers from public kindergartens in the city of Baghdad. As the research tool, the two researchers designed a questionnaire to measure the educational plan for private and public kindergartens. The results revealed that private kindergartens have educational plans that contribute considerably to classroom interaction, whereas public kindergartens lack educational plans. In light of the findings, the researchers recommend the need to set up a unified educational plan for the private and public kindergartens.
The aim of this paper is to propose an efficient three-step iterative method for finding the zeros of the nonlinear equation f(x) = 0. Starting with a suitably chosen initial guess, the method generates a sequence of iterates converging to the root. A convergence analysis is given to establish its fifth order of convergence. Several examples are given to illustrate the efficiency of the proposed method and to compare it with other methods.
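The paper's specific iteration formulas are not reproduced above; the sketch below only illustrates one classical three-step Newton-type composition (a Newton predictor followed by two correctors, one with the frozen and one with an updated derivative) of the kind such fifth-order methods are typically built from, and is not claimed to be the proposed scheme.

```python
def three_step_solve(f, fprime, x0, tol=1e-12, max_iter=50):
    """Three-step Newton-type iteration for f(x) = 0.

    y = x - f(x)/f'(x)     (Newton predictor)
    z = y - f(y)/f'(x)     (corrector with frozen derivative)
    x = z - f(z)/f'(y)     (corrector with updated derivative)
    """
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        dfx = fprime(x)
        y = x - fx / dfx
        z = y - f(y) / dfx
        x = z - f(z) / fprime(y)
    return x

# Illustrative use: the real root of f(x) = x**3 - 2 starting from x0 = 1.5.
root = three_step_solve(lambda x: x**3 - 2, lambda x: 3 * x**2, 1.5)
print(root, root**3)
```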
Deepfake is a type of artificial intelligence used to create convincing image, audio, and video hoaxes, and it concerns celebrities and everyone else because such fakes are easy to manufacture. Deepfakes are hard to recognize by people and by current approaches, especially high-quality ones. As a defense against Deepfake techniques, various methods for detecting Deepfakes in images have been suggested. Most of them have limitations, such as working only with a single face in an image, or requiring the face to be facing forward with both eyes and the mouth open, depending on which part of the face they work on. Moreover, only a few focus on the impact of pre-processing steps on the detection accuracy of the models. This paper introduces a framework design focused on this aspect.
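As an example of the kind of pre-processing step whose impact such a framework might study, here is a minimal face detect-crop-resize-normalize sketch using OpenCV's bundled Haar cascade; the cascade choice, output size, scaling, and the `model.predict` call in the comment are assumptions, not the paper's pipeline.

```python
import cv2
import numpy as np

# Haar cascade bundled with OpenCV; a stronger detector could be substituted.
_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def preprocess_face(image_bgr, size=(224, 224)):
    """Detect the largest face, crop it, resize, and scale intensities to [0, 1].

    Returns None when no face is found, so the caller can skip the frame.
    """
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    faces = _cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda box: box[2] * box[3])  # keep the largest detection
    crop = image_bgr[y:y + h, x:x + w]
    crop = cv2.resize(crop, size, interpolation=cv2.INTER_AREA)
    return crop.astype(np.float32) / 255.0  # input tensor for a detection model

# Illustrative use (paths and classifier are hypothetical):
# frame = cv2.imread("frame.jpg")
# face = preprocess_face(frame)
# if face is not None:
#     score = model.predict(face[None, ...])
```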
Globally, sustainability is quickly becoming a fundamental requirement of the construction industry as it delivers its projects, whether buildings or infrastructure. Over more than two decades, many modeling schemes, evaluation tools, and rating systems have been introduced en route to realizing sustainable construction. Many of these, however, lack consensus on evaluation criteria and a robust scientific model that captures the logic behind their sustainability performance evaluation, and therefore exhibit discrepancies between rated results and actual performance. Moreover, very few of the available evaluation tools satisfactorily address infrastructure projects. The res
