Optimizing system performance in dynamic and heterogeneous environments and managing computational tasks efficiently are crucial. This paper therefore examines task scheduling and resource allocation algorithms in depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across workloads obtained by varying the task-to-node ratio. The paper identifies Finish Time and Deadline as two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behavior of these algorithms across different workloads was carried out. The experiments reveal distinct algorithmic behavior patterns by workload. In the 15-task, 5-node scenario, the GA and PSO algorithms outperformed all others, completing 100 percent of tasks before their deadlines, while Task 5 proved problematic for the ACO algorithm; with 10 tasks and 5 nodes, GA and PSO again completed 100 percent of tasks before their deadlines, while ACO stumbled on certain tasks. The study proposes a more extensive system that adopts an adaptive algorithmic approach based on workload characteristics: an intelligent, aggressive scheduling scheme that runs asynchronously when a higher number of tasks is submitted for completion and dynamically aborts tasks whenever system load and utilization rise excessively. The proposed design offers an integrated approach to the ill-structured problem of task scheduling and resource allocation, detailing how algorithms are chosen on the basis of workload features, with flexibility as the aim. Quantifiable statistical results from the experiments empirically demonstrate how each algorithm performed under the various settings.
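As a concrete illustration of the kind of scheduler being compared, the sketch below implements a minimal genetic algorithm that assigns tasks to nodes so as to minimize finish time (makespan). The task durations, node count, and GA settings are illustrative assumptions, not the paper's experimental workloads.

```python
import random

# Hypothetical workload: task durations (time units) on homogeneous nodes.
TASKS = [4, 7, 2, 9, 5, 3, 8, 6, 1, 5]   # 10 tasks
NODES = 5

def makespan(assignment):
    """Finish time = load of the busiest node."""
    loads = [0] * NODES
    for duration, node in zip(TASKS, assignment):
        loads[node] += duration
    return max(loads)

def genetic_schedule(generations=200, pop_size=30, mutation_rate=0.1, seed=0):
    """Evolve task-to-node assignments; lower makespan is fitter."""
    rng = random.Random(seed)
    pop = [[rng.randrange(NODES) for _ in TASKS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)                  # rank by finish time
        survivors = pop[: pop_size // 2]        # truncation selection (elitist)
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(TASKS))  # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(len(child)):         # per-gene mutation
                if rng.random() < mutation_rate:
                    child[i] = rng.randrange(NODES)
            children.append(child)
        pop = survivors + children
    return min(pop, key=makespan)

best = genetic_schedule()
```

PSO, ACO, FA, and SA would replace only the search loop; the makespan objective and the task-to-node encoding stay the same, which is what makes a like-for-like comparison across workloads possible.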
This work aims to develop a secure lightweight cipher algorithm for constrained devices. Secure communication among constrained devices is a critical issue during data transmission from client to server devices. Lightweight cipher algorithms are regarded as a secure solution for constrained devices because they require only low computational effort and small memory. However, most lightweight algorithms suffer from a trade-off between complexity and speed when producing a robust cipher. The PRESENT cipher has proven successful as a lightweight cryptographic algorithm, outperforming other ciphers in computational processing by relying on low-complexity operations. The mathematical model of
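To illustrate the low-complexity operations that make PRESENT attractive for constrained devices, the sketch below implements the two layers of a PRESENT round, the 4-bit S-box and the 64-bit bit permutation, following the published specification. The key schedule and the full 31-round structure are omitted, so this is a teaching sketch, not a complete cipher.

```python
# PRESENT's single 4-bit S-box (hex: C56B90AD3EF84712).
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def sbox_layer(state):
    """Apply the 4-bit S-box to each of the 16 nibbles of the 64-bit state."""
    out = 0
    for i in range(16):
        nibble = (state >> (4 * i)) & 0xF
        out |= SBOX[nibble] << (4 * i)
    return out

def p_layer(state):
    """PRESENT bit permutation: bit i moves to 16*i mod 63 (bit 63 is fixed)."""
    out = 0
    for i in range(64):
        bit = (state >> i) & 1
        dest = 63 if i == 63 else (16 * i) % 63
        out |= bit << dest
    return out

def round_function(state, round_key):
    """One PRESENT round: addRoundKey, then substitution, then permutation."""
    return p_layer(sbox_layer(state ^ round_key))
```

Both layers use only XOR, shifts, and a 16-entry table lookup, which is the "low-complexity operations" property the abstract refers to.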
It is believed that organizations around the world should be prepared for the transition to IPv6 and make sure they have the know-how to choose the right time to start the migration. This paper focuses on IPv6 transition mechanisms. It also proposes and tests a deployment of an IPv6 prototype within the intranet of the University of Baghdad (BUniv) using virtualization software, and it addresses security issues, improvements, and extensions of the IPv6 network using firewalls, Virtual Private Networks (VPNs), and Access Control Lists (ACLs). Finally, the performance of the resulting intrusion detection model is assessed and compared with three other approaches.
Purpose – Cloud computing (CC) and its services have enabled the information centers of organizations to adapt their informatic and technological infrastructure, making it more suitable for developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud-data governance has become more complex and dynamic, requiring an in-depth understanding of the data management strategy at these centers in terms of organizational structure and regulations, people, technology, processes, and roles and responsibilities. Our paper therefore discusses these dimensions as challenges facing information centers with regard to their data governance and the impa
The research aims to investigate the reality of the planning and scheduling management process for the implementation and maintenance of irrigation and drainage projects in the Republic of Iraq, indicating the most important obstacles that impede this process and ways of addressing them and minimizing their effects. For the purpose of achieving the goal of the research, a sci
In this paper, an application of non-additive measures for re-evaluating the degree of importance of some student failure reasons is discussed. We apply non-additive fuzzy integral models (the Sugeno, Shilkret, and Choquet integrals) to some expected factors that affect student examination performance across different students' cases.
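As a minimal sketch of one of the three models, the discrete Choquet integral over a non-additive (fuzzy) measure can be computed as follows. The student-failure factors and the measure weights are hypothetical placeholders, not values from the paper.

```python
from itertools import combinations

def choquet_integral(scores, measure):
    """Discrete Choquet integral of `scores` (criterion -> value in [0, 1])
    with respect to a fuzzy measure `measure` (frozenset of criteria -> weight).
    Sort criteria by ascending score; each increment of the score is weighted
    by the measure of the coalition of criteria scoring at least that much."""
    items = sorted(scores.items(), key=lambda kv: kv[1])
    total, prev = 0.0, 0.0
    for idx, (_, val) in enumerate(items):
        coalition = frozenset(c for c, _ in items[idx:])
        total += (val - prev) * measure[coalition]
        prev = val
    return total

# Hypothetical factors with an additive measure built from per-factor weights;
# a genuinely non-additive measure would assign coalition weights directly.
weights = {"attendance": 0.5, "study_hours": 0.3, "sleep": 0.2}
measure = {frozenset(s): sum(weights[c] for c in s)
           for r in range(len(weights) + 1)
           for s in combinations(weights, r)}
```

With an additive measure the Choquet integral reduces to a weighted average, which makes the sketch easy to check; the interest of the non-additive case is precisely that coalition weights need not be sums of individual weights.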
The Weibull distribution is considered one of the most widely applied distributions in real life. It resembles the normal distribution in its breadth of application and can be used in many fields, such as industrial engineering (to represent replacement and manufacturing times), weather forecasting, and other scientific uses in reliability studies and survival analysis in the medical and communication engineering fields.
In this paper, the scale parameter of the Weibull distribution has been estimated using a Bayesian method based on Jeffreys prior information as a first method, then enhanced by improving the Jeffreys prior information and used as a se
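A minimal sketch of the first (unimproved) method, under the standard assumptions that the shape parameter k is known and the Jeffreys prior pi(theta) proportional to 1/theta is used; the paper's improved-prior variant is not reproduced here. Writing beta = theta**k, the posterior of beta given data t_1..t_n is Inverse-Gamma(n, S) with S = sum of t_i**k, so its mean S/(n-1) is the Bayes estimate of beta under squared-error loss.

```python
def bayes_weibull_scale(data, shape):
    """Bayes estimate of the Weibull scale parameter theta, with known
    shape k and Jeffreys prior pi(theta) ~ 1/theta: the posterior mean of
    beta = theta**k is S/(n-1), and theta is recovered as beta**(1/k)."""
    n = len(data)
    if n < 2:
        raise ValueError("need at least two observations")
    S = sum(t ** shape for t in data)
    return (S / (n - 1)) ** (1.0 / shape)
```

For example, with data [1.0, 2.0, 3.0] and shape 2, S = 14 and the estimate is sqrt(14/2) = sqrt(7).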
Canonical correlation analysis is a common method for analyzing data and determining the relationship between two sets of variables under study; it depends on analyzing the variance matrix or the correlation matrix. Researchers resort to many methods to estimate the canonical correlation (CC); some are biased by outliers, while others are resistant to those values. In addition, there are criteria that assess the efficiency of the estimation methods.
In our research, we dealt with robust estimation methods that depend on the correlation matrix in the analysis process to obtain a robust canonical correlation coefficient, which is the method of Biwe
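For reference, the classical (non-robust) canonical correlations can be computed from the sample covariance matrices as sketched below; the robust variants discussed in the paper would substitute a robust covariance or correlation estimate at the marked step. This is a generic sketch, not the paper's implementation.

```python
import numpy as np

def canonical_correlations(X, Y, eps=1e-10):
    """Canonical correlations between X (n x p) and Y (n x q)."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = X.shape[0]
    # Sample covariance blocks; a robust CC method would replace these
    # with a robust covariance/correlation estimate.
    Sxx = Xc.T @ Xc / (n - 1) + eps * np.eye(X.shape[1])
    Syy = Yc.T @ Yc / (n - 1) + eps * np.eye(Y.shape[1])
    Sxy = Xc.T @ Yc / (n - 1)
    # Whiten both blocks; the singular values of the whitened
    # cross-covariance are the canonical correlations.
    Wx = np.linalg.inv(np.linalg.cholesky(Sxx))
    Wy = np.linalg.inv(np.linalg.cholesky(Syy))
    M = Wx @ Sxy @ Wy.T
    return np.clip(np.linalg.svd(M, compute_uv=False), 0.0, 1.0)
```

When Y is an exact linear transform of X the leading canonical correlation is (numerically) 1, while for independent sets it stays near 0, which is a quick sanity check on any CC estimator, robust or not.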
This study aims to determine the prevalence of Entamoeba histolytica, Entamoeba dispar, and Entamoeba moshkovskii by comparing three methods of diagnosis (microscopic examination, cultivation, and PCR) in order to obtain an accurate diagnosis of Entamoeba spp. during amoebiasis. A total of 150 stool samples were examined: 100 from patients and 50 from healthy controls. The clinically diagnosed stool samples (n=100) were collected from patients attending the consultant clinics of different hospitals in Basrah during the period from January 2018 to January 2019. The results showed that 60% of the collected samples were positive on direct microscopic examination. All samples were cultivated on different media; the Bra