Optimizing system performance in dynamic, heterogeneous environments and managing computational tasks efficiently are crucial. This paper therefore examines task scheduling and resource allocation algorithms in depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across workloads obtained by varying the task-to-node ratio. The paper identifies finish time and deadline as two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behavior of these algorithms across different workloads was carried out. The experimental results reveal distinct patterns in algorithmic behavior by workload. In the 15-task, 5-node scenario, the GA and PSO algorithms outperform all others, completing 100 percent of tasks before their deadlines, whereas Task 5 proved problematic for the ACO algorithm. Likewise, with 10 tasks and 5 nodes, GA and PSO again completed 100 percent of tasks before their deadlines, while ACO stumbled on certain tasks. Based on these workload-dependent characteristics, the study proposes a more extensive system that takes an adaptive algorithmic approach: an intelligent scheduling scheme that runs asynchronously when a large number of tasks is submitted for completion and dynamically aborts tasks whenever system load and utilization rise excessively. The proposed system offers an integrated approach to the ill-structured problem of task scheduling and resource allocation, and it details a method for choosing algorithms based on workload characteristics, aiming at flexibility. The quantifiable statistical results from the experiments empirically demonstrate how each algorithm performed under various settings.
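To make the experimental setting concrete, the following minimal sketch shows how one of the five evaluated algorithms, Simulated Annealing, can assign tasks to nodes against finish-time and deadline metrics. The task durations, node speeds, deadlines, and penalty weight are illustrative assumptions, not the paper's actual configuration.

```python
import random
import math

# Illustrative values only (not the paper's workload): 10 tasks, 5 nodes.
durations = [4, 2, 7, 3, 5, 6, 1, 8, 2, 4]   # base cost of each task
speeds    = [1.0, 1.5, 0.8, 1.2, 1.0]        # relative speed of each node
deadlines = [20.0] * len(durations)          # per-task deadlines (toy values)

def finish_times(assign):
    """Per-task finish time when tasks run in index order on their node."""
    clock = [0.0] * len(speeds)
    finish = [0.0] * len(durations)
    for t, n in enumerate(assign):
        clock[n] += durations[t] / speeds[n]
        finish[t] = clock[n]
    return finish

def cost(assign):
    finish = finish_times(assign)
    misses = sum(f > d for f, d in zip(finish, deadlines))
    return max(finish) + 100.0 * misses      # makespan + deadline-miss penalty

def anneal(iters=5000, temp=10.0, cooling=0.999):
    current = [random.randrange(len(speeds)) for _ in durations]
    best = current[:]
    for _ in range(iters):
        cand = current[:]
        # Perturb: move one random task to a random node
        cand[random.randrange(len(cand))] = random.randrange(len(speeds))
        delta = cost(cand) - cost(current)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = cand
            if cost(current) < cost(best):
                best = current[:]
        temp *= cooling
    return best, cost(best)

assignment, c = anneal()
print("assignment:", assignment, "cost:", round(c, 2))
```

The deadline-miss penalty dominates the makespan term, so the search first drives missed deadlines to zero, mirroring the deadline-centric evaluation described above.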
Canonical correlation analysis is one of the common methods for analyzing data and understanding the relationship between two sets of variables under study; it relies on analysis of the variance-covariance matrix or the correlation matrix. Researchers resort to many methods to estimate the canonical correlation (CC); some are biased by outliers, while others are resistant to such values. In addition, there are criteria for assessing the efficiency of the estimation methods.
In our research, we dealt with robust estimation methods that depend on the correlation matrix in the analysis process to obtain a robust canonical correlation coefficient, namely the method of Biweight …
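As a generic illustration of correlation-matrix-based canonical correlation (not the paper's robust estimator), the sketch below derives the canonical correlation coefficients from a partitioned correlation matrix of synthetic data; a robust variant would simply substitute a robust correlation estimate for np.corrcoef.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                                 # first variable set (toy)
Y = X @ rng.normal(size=(3, 2)) + rng.normal(size=(200, 2))   # related second set

R = np.corrcoef(np.hstack([X, Y]), rowvar=False)   # full correlation matrix
p = X.shape[1]
Rxx, Ryy, Rxy = R[:p, :p], R[p:, p:], R[:p, p:]

def inv_sqrt(M):
    """Inverse square root of a symmetric positive-definite matrix."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(w ** -0.5) @ V.T

# Canonical correlations = singular values of Rxx^{-1/2} Rxy Ryy^{-1/2}
K = inv_sqrt(Rxx) @ Rxy @ inv_sqrt(Ryy)
cc = np.linalg.svd(K, compute_uv=False)
print("canonical correlations:", np.round(cc, 3))
```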
This work aims to develop a secure lightweight cipher algorithm for constrained devices. Secure communication among constrained devices is a critical issue during data transmission from client to server devices. Lightweight cipher algorithms are defined as a secure solution for constrained devices that require low computational effort and small memory. However, most lightweight algorithms suffer from a trade-off between complexity and speed when producing a robust cipher. The PRESENT cipher has been successfully demonstrated as a lightweight cryptographic algorithm that surpasses other ciphers in computational processing, requiring only low-complexity operations. The mathematical model of …
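For reference, here is a compact Python sketch of standard PRESENT-80 encryption as published in the original cipher specification (4-bit S-box layer, bit permutation layer, and a 61-bit-rotation key schedule). It illustrates the baseline cipher only, not the modified algorithm this work develops.

```python
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def s_layer(state):
    out = 0
    for i in range(16):                      # 16 parallel 4-bit S-boxes
        out |= SBOX[(state >> (4 * i)) & 0xF] << (4 * i)
    return out

def p_layer(state):
    out = 0
    for i in range(64):                      # bit i moves to (16*i) mod 63
        j = 63 if i == 63 else (16 * i) % 63
        out |= ((state >> i) & 1) << j
    return out

def round_keys(key):                         # 80-bit key register
    keys = []
    for r in range(1, 33):
        keys.append(key >> 16)               # round key = leftmost 64 bits
        key = ((key << 61) | (key >> 19)) & (2 ** 80 - 1)  # rotate left 61
        key = (SBOX[key >> 76] << 76) | (key & (2 ** 76 - 1))
        key ^= r << 15                       # round counter into bits 19..15
    return keys

def encrypt(block, key):
    ks = round_keys(key)
    for r in range(31):                      # 31 rounds + final key addition
        block = p_layer(s_layer(block ^ ks[r]))
    return block ^ ks[31]

# Published all-zero test vector should give 0x5579c1387b228445
print(hex(encrypt(0x0, 0x0)))
```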
It is believed that organizations around the world should be prepared for the transition to IPv6 and make sure they have the know-how to succeed in choosing the right time to start the migration. This paper focuses on IPv6 transition mechanisms. It also proposes and tests the deployment of an IPv6 prototype within the intranet of the University of Baghdad (BUniv) using virtualization software, and it deals with security issues, improvements, and extensions of the IPv6 network using firewalls, Virtual Private Networks (VPNs), and Access Control Lists (ACLs). Finally, the performance of the resulting intrusion detection model is assessed and compared with three approaches.
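As a small illustration of one well-known transition mechanism in this space, the sketch below derives a 6to4 prefix (RFC 3056) from an IPv4 address using Python's standard ipaddress module; the address shown is a documentation example, and the paper's own deployment details may differ.

```python
import ipaddress

def sixto4_prefix(ipv4: str) -> ipaddress.IPv6Network:
    """6to4 maps an IPv4 address into 2002::/16 by embedding its 32 bits."""
    v4 = int(ipaddress.IPv4Address(ipv4))
    # 2002:VVVV:VVVV::/48, where VVVVVVVV is the IPv4 address in hex
    return ipaddress.IPv6Network(((0x2002 << 112) | (v4 << 80), 48))

print(sixto4_prefix("192.0.2.1"))   # -> 2002:c000:201::/48
```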
Purpose – Cloud computing (CC) and its services have enabled the information centers of organizations to adapt their informatic and technological infrastructure, making it more appropriate for developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud-data governance has become more complex and dynamic, requiring an in-depth understanding of the data management strategy at these centers in terms of organizational structure and regulations, people, technology, processes, and roles and responsibilities. Therefore, our paper discusses these dimensions as challenges facing information centers with respect to their data governance and the impact …
In this study, four different spectrophotometric methods were applied for the determination of cimetidine and erythromycin ethylsuccinate in pure form and in their pharmaceutical preparations. The suggested methods are simple, sensitive, accurate, rapid, and inexpensive. The results showed the following. The first method is based on the formation of an ion-pair complex of each drug with bromothymol blue (BTB) as a chromogenic reagent. The formed complexes were extracted with chloroform and their absorbance values were measured at 427.5 nm for cimetidine and 416.5 nm for erythromycin ethylsuccinate, against their reagent blanks. Two different approaches, a univariate method and a multivariate method, were used to obtain the optimum conditions …
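Quantitative determinations of this kind rest on a Beer-Lambert calibration line relating absorbance to concentration. The sketch below fits such a line and reads back an unknown; all values are illustrative, not the paper's data.

```python
import numpy as np

# Toy calibration standards (ug/mL) and their absorbances at 427.5 nm
conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
absorbance = np.array([0.11, 0.22, 0.34, 0.45, 0.56])

# Least-squares line A = slope*C + intercept, plus correlation coefficient
slope, intercept = np.polyfit(conc, absorbance, 1)
r = np.corrcoef(conc, absorbance)[0, 1]

unknown_abs = 0.40
unknown_conc = (unknown_abs - intercept) / slope
print(f"slope={slope:.4f}, intercept={intercept:.4f}, r^2={r**2:.4f}")
print(f"estimated concentration: {unknown_conc:.2f} ug/mL")
```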
Four rapid, accurate, and very simple derivative spectrophotometric techniques were developed for the quantitative determination of binary mixtures of estradiol (E2) and progesterone (PRG) formulated as a capsule. Method I is the first-derivative zero-crossing technique: derivative amplitudes were measured at the zero-crossing wavelengths of 239.27 and 292.51 nm for the quantification of estradiol and at 249.19 nm for progesterone. Method II is ratio subtraction: progesterone was determined at λmax 240 nm after subtraction of the interference exerted by estradiol. Method III is modified amplitude subtraction, established using derivative spectroscopy and mathematical manipulations. Method IV is the absorbance ratio technique, in which absorbance …
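The sketch below illustrates the zero-crossing idea behind Method I on synthetic spectra (not the paper's data): in a binary mixture, the first-derivative amplitude is read at a wavelength where the other component's derivative crosses zero, so only the analyte of interest contributes to the signal.

```python
import numpy as np

wl = np.linspace(220, 320, 1001)                  # wavelength grid, nm

def gauss(mu, sig):
    """Hypothetical Gaussian absorption band centered at mu."""
    return np.exp(-((wl - mu) / sig) ** 2)

spec_a = 0.8 * gauss(260, 15)                     # toy component A
spec_b = 0.5 * gauss(280, 12)                     # toy component B

d_a = np.gradient(spec_a, wl)                     # first-derivative spectra
d_b = np.gradient(spec_b, wl)

# Zero-crossings of B's derivative: wavelengths where A is interference-free
zc = np.where(np.diff(np.sign(d_b)) != 0)[0]
for i in zc:
    print(f"B crosses zero near {wl[i]:.1f} nm; dA/dwl there = {d_a[i]:+.5f}")
```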
The transfer function model is one of the basic concepts in time series analysis; it is used in the case of multivariate time series. The design of this model depends on the data available in the series and on other information in the series, so the representation of the transfer function model depends on the representation of the data. In this research, the transfer function was estimated using two nonparametric approaches, local linear regression and the cubic smoothing spline method, as well as a semiparametric approach, the semiparametric single-index model, with four proposals. The goal of this research is to compare the capabilities of the above-mentioned methods …
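As a toy illustration of one of the nonparametric estimators mentioned above, the cubic smoothing spline, the sketch below fits a spline to a noisy synthetic signal standing in for a series; the data and smoothing level are illustrative only, not the research's proposals.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 200)
y = np.sin(t) + 0.3 * rng.normal(size=t.size)   # noisy toy signal

# k=3 gives a cubic spline; s controls the smoothness/fidelity trade-off
spline = UnivariateSpline(t, y, k=3, s=len(t) * 0.09)
fitted = spline(t)
print("residual std:", np.std(y - fitted).round(3))
```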