Optimizing system performance in dynamic and heterogeneous environments depends on the efficient management of computational tasks, so this paper examines task scheduling and resource allocation algorithms in depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across workloads produced by varying the task-to-node ratio. The paper identifies Finish Time and Deadline as the two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behavior of these algorithms across the different workloads was carried out. The experimental results reveal distinct patterns in algorithmic behavior by workload. In the 15-task, 5-node scenario, GA and PSO outperform all others, completing 100 percent of tasks before their deadlines, while the ACO algorithm fails on Task 5; with 10 tasks and 5 nodes, GA and PSO again complete 100 percent of tasks before their deadlines, while ACO stumbles on certain tasks. Based on these results, the study proposes a more extensive system that adopts an adaptive algorithmic approach driven by workload characteristics. The proposed system offers an integrated approach to the ill-structured problem of task scheduling and resource allocation: an intelligent scheduling scheme that runs asynchronously when a larger number of tasks is submitted for completion and that dynamically aborts tasks whenever system load and utilization rise excessively. The design is presented as a full-fledged solution to task scheduling and resource allocation issues, detailing how algorithms are selected on the basis of semantic features with the aim of flexibility. The quantifiable statistical results from the experiments demonstrate empirically how each algorithm performed under the various settings.
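As a hedged illustration of the kind of deadline-driven scheduling these metaheuristics perform (not the paper's implementation), the following sketch applies a simple genetic algorithm to a 15-task, 5-node assignment problem; the task durations, node speeds, deadlines, and all GA parameters are hypothetical.

# Minimal illustrative sketch: a genetic algorithm that assigns tasks to nodes
# so that as many tasks as possible finish before their deadlines.
import random

random.seed(42)

N_TASKS, N_NODES = 15, 5
durations = [random.uniform(1, 10) for _ in range(N_TASKS)]    # work units per task
speeds    = [random.uniform(0.5, 2.0) for _ in range(N_NODES)] # work units per time unit
deadlines = [random.uniform(5, 30) for _ in range(N_TASKS)]

def finish_times(assignment):
    """Finish time of each task when tasks run in submission order on their node."""
    node_clock = [0.0] * N_NODES
    finish = [0.0] * N_TASKS
    for task, node in enumerate(assignment):
        node_clock[node] += durations[task] / speeds[node]
        finish[task] = node_clock[node]
    return finish

def fitness(assignment):
    """Fraction of tasks that finish before their deadline (higher is better)."""
    finish = finish_times(assignment)
    return sum(f <= d for f, d in zip(finish, deadlines)) / N_TASKS

def crossover(a, b):
    cut = random.randrange(1, N_TASKS)
    return a[:cut] + b[cut:]

def mutate(assignment, rate=0.1):
    return [random.randrange(N_NODES) if random.random() < rate else gene
            for gene in assignment]

population = [[random.randrange(N_NODES) for _ in range(N_TASKS)] for _ in range(40)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                      # elitist selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(30)]
    population = parents + children

best = max(population, key=fitness)
print("best deadline-hit ratio:", fitness(best))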
This study utilizes streamline simulation to model fluid flow in the complex subsurface environment of the Mishrif reservoir in Iraq's Buzurgan oil field. The reservoir faces challenges from high pressure depletion and a substantial increase in water cut during production, prompting the need for innovative reservoir management. The primary focus is on optimizing water injection procedures to reduce water cut and enhance overall reservoir performance. Three waterflooding scenarios were examined: normal conditions without injectors or producers, normal conditions with 30 injectors and 80 producers, and streamline simulation using the FrontSim simulator. Three main strategies were then employed to direct water injection into targeted areas.
Recently, Human Activity Recognition (HAR) has become a popular research field due to the widespread availability of sensor devices. Sensors embedded in smartwatches and smartphones have enabled applications to use activity recognition, with challenges such as supporting the daily life of the elderly. Many approaches have been implemented in research aimed at recognizing and analyzing human activity. Most published articles on human activity recognition use multi-sensor methods in which a number of sensors are attached to different positions on the body, which is not suitable for many users. Currently, smartphones and smartwatches combine different types of sensors, which presents a new area for analysis.
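A minimal sketch of a sensor-based HAR pipeline of this kind, assuming synthetic tri-axial accelerometer windows and a scikit-learn RandomForest classifier (the window length, features, and activity labels are illustrative, not taken from any cited study):

# Illustrative sketch only: extract simple statistical features from fixed
# windows of tri-axial accelerometer data and train a classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
WINDOW = 128  # samples per window (e.g. ~2.5 s at 50 Hz)

def synthetic_windows(n, activity):
    """Crude stand-in for real sensor windows: noise level differs per activity."""
    scale = {"sitting": 0.1, "walking": 1.0, "running": 2.5}[activity]
    return rng.normal(0.0, scale, size=(n, WINDOW, 3))

def features(window):
    """Per-axis mean and std plus signal magnitude area for one window."""
    mean = window.mean(axis=0)
    std = window.std(axis=0)
    sma = np.abs(window).sum() / len(window)
    return np.concatenate([mean, std, [sma]])

X, y = [], []
for label in ("sitting", "walking", "running"):
    for w in synthetic_windows(200, label):
        X.append(features(w))
        y.append(label)

X_train, X_test, y_train, y_test = train_test_split(np.array(X), y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))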
Purpose – Cloud computing (CC) and its services have enabled the information centers of organizations to adapt their informatic and technological infrastructure, making it more suitable for developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud data governance has become more complex and dynamic, requiring an in-depth understanding of the data management strategy at these centers in terms of organizational structure and regulations, people, technology, processes, and roles and responsibilities. Therefore, our paper discusses these dimensions as challenges facing information centers with respect to their data governance and its impact.
Reserve estimation is a continuous process during the life of a field because risk and inaccuracy are an endemic problem that must be studied; the true, properly defined hydrocarbon content can be identified only at field depletion. As a result, the reserve estimation challenge is a function of time and of the available data. Reserve estimation methods can be divided into five types: analogy, volumetric, decline curve analysis, material balance, and reservoir simulation, each of which differs from the others in the kind of data required. The choice of the appropriate method depends on reservoir maturity, reservoir heterogeneity, and the data acquisition required. In this research, three of these reserve estimation methods are applied.
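Of the five named methods, decline curve analysis lends itself to a compact illustration. The sketch below fits an Arps exponential decline to made-up production data and computes recovery to an assumed economic limit; none of the numbers come from the study.

# Hedged sketch of decline curve analysis (Arps exponential decline).
import numpy as np

t = np.arange(0, 48)                       # months on production
q_true = 1200.0 * np.exp(-0.025 * t)       # synthetic oil rate, STB/day
q_obs = q_true * (1 + 0.03 * np.random.default_rng(1).standard_normal(t.size))

# Exponential decline q(t) = qi * exp(-D t) is linear in log space,
# so qi and D follow from a straight-line fit of ln(q) versus t.
D, ln_qi = np.polyfit(t, np.log(q_obs), 1)
D, qi = -D, np.exp(ln_qi)

q_limit = 100.0                            # assumed economic-limit rate, STB/day
months_to_limit = np.log(qi / q_limit) / D
# Cumulative production to the economic limit for exponential decline:
# Np = (qi - q_limit) / D   (D is per month, so convert with ~30.4 days/month)
eur_stb = (qi - q_limit) / D * 30.4

print(f"qi ≈ {qi:.0f} STB/d, D ≈ {D:.3f}/month, "
      f"life ≈ {months_to_limit:.0f} months, EUR ≈ {eur_stb:,.0f} STB")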
In this paper, an application of non-additive measures for re-evaluating the degree of importance of some student failure reasons is discussed. We apply non-additive fuzzy integral models (the Sugeno, Shilkret, and Choquet integrals) to some expected factors that affect student examination performance, for different students' cases.
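As a hedged illustration of how such non-additive integrals aggregate factor scores (the criteria, measure values, and student scores below are invented, not the paper's data), the discrete Choquet and Sugeno integrals can be computed as follows:

# Minimal sketch of the discrete Choquet and Sugeno integrals with respect to
# a fuzzy (non-additive) measure over three made-up criteria.
criteria = ["attendance", "study_hours", "prior_grades"]

# Fuzzy measure mu: one value per subset of criteria, mu({}) = 0, mu(all) = 1.
# Non-additivity lets mu reward (or penalise) interactions between criteria.
mu = {
    frozenset(): 0.0,
    frozenset({"attendance"}): 0.3,
    frozenset({"study_hours"}): 0.45,
    frozenset({"prior_grades"}): 0.4,
    frozenset({"attendance", "study_hours"}): 0.8,
    frozenset({"attendance", "prior_grades"}): 0.6,
    frozenset({"study_hours", "prior_grades"}): 0.7,
    frozenset(criteria): 1.0,
}

def choquet(scores):
    """Discrete Choquet integral: sum of score increments times the measure
    of the set of criteria whose score reaches that level."""
    order = sorted(criteria, key=lambda c: scores[c])
    total, prev = 0.0, 0.0
    for i, c in enumerate(order):
        coalition = frozenset(order[i:])          # criteria scoring >= scores[c]
        total += (scores[c] - prev) * mu[coalition]
        prev = scores[c]
    return total

def sugeno(scores):
    """Discrete Sugeno integral: max over levels of min(score, measure)."""
    order = sorted(criteria, key=lambda c: scores[c])
    return max(min(scores[c], mu[frozenset(order[i:])]) for i, c in enumerate(order))

student = {"attendance": 0.9, "study_hours": 0.4, "prior_grades": 0.6}
print("Choquet:", round(choquet(student), 3), " Sugeno:", round(sugeno(student), 3))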
The Weibull distribution is considered one of the most widely applied distributions in real life. It is similar to the normal distribution in its breadth of applications and can be applied in many fields, such as industrial engineering to represent replacement and manufacturing times, weather forecasting, and other scientific uses including reliability studies and survival functions in the medical and communication engineering fields.
In this paper, the scale parameter of the Weibull distribution has been estimated using a Bayesian method based on Jeffreys prior information as a first method; this was then enhanced by improving the Jeffreys prior information and used as a second method.
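The abstract does not state the estimator itself; as a hedged sketch under one common parameterization, with the shape parameter β assumed known, the Jeffreys-prior Bayes estimator of the scale parameter λ under squared-error loss can be derived as follows (LaTeX):

% Hedged sketch, not the paper's derivation: Bayes estimation of the Weibull
% scale parameter \lambda under the Jeffreys prior, assuming known shape \beta
% and the parameterization f(x\mid\lambda)=\frac{\beta}{\lambda}x^{\beta-1}e^{-x^{\beta}/\lambda}.
\[
  \pi(\lambda) \propto \frac{1}{\lambda}, \qquad
  L(\lambda \mid x_1,\dots,x_n) \propto \lambda^{-n}
      \exp\!\Big(-\frac{T}{\lambda}\Big), \quad T=\sum_{i=1}^{n} x_i^{\beta},
\]
\[
  \pi(\lambda \mid x) \propto \lambda^{-(n+1)} e^{-T/\lambda}
  \;\;\Longrightarrow\;\; \lambda \mid x \sim \mathrm{Inv\text{-}Gamma}(n,\,T),
  \qquad
  \hat{\lambda}_{\mathrm{Bayes}} = \mathbb{E}[\lambda \mid x] = \frac{T}{n-1}.
\]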
In this study, four different spectrophotometric methods were applied for the determination of cimetidine and erythromycin ethylsuccinate drugs in pure form and in their pharmaceutical preparations. The suggested methods are simple, sensitive, accurate, inexpensive, and not time consuming. The results showed the following. The first method is based on the formation of an ion-pair complex of each drug with bromothymol blue (BTB) as a chromogenic reagent. The formed complexes were extracted with chloroform and their absorbance values were measured at 427.5 nm for cimetidine and 416.5 nm for erythromycin ethylsuccinate, against their reagent blanks. Two different approaches, a univariate method and a multivariate method, were used to obtain the optimum conditions.
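As an illustration of the univariate determination step such ion-pair methods rely on (the absorbance and concentration values below are hypothetical, not the paper's results), a straight-line Beer-Lambert calibration can be fitted and inverted for an unknown sample:

# Hedged illustration: linear calibration of absorbance against concentration,
# then inversion to estimate an unknown sample's concentration.
import numpy as np

conc_std = np.array([2.0, 4.0, 6.0, 8.0, 10.0])           # µg/mL standards (hypothetical)
abs_std  = np.array([0.112, 0.221, 0.335, 0.446, 0.553])  # absorbance at the analytical wavelength

slope, intercept = np.polyfit(conc_std, abs_std, 1)       # A = slope*C + intercept
r2 = np.corrcoef(conc_std, abs_std)[0, 1] ** 2

abs_sample = 0.390                                         # measured against reagent blank
conc_sample = (abs_sample - intercept) / slope

print(f"calibration: A = {slope:.4f}*C + {intercept:.4f} (R^2 = {r2:.4f})")
print(f"sample concentration ≈ {conc_sample:.2f} µg/mL")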
Four rapid, accurate, and very simple derivative spectrophotometric techniques were developed for the quantitative determination of binary mixtures of estradiol (E2) and progesterone (PRG) formulated as a capsule. Method I is the first-derivative zero-crossing technique: derivative amplitudes were measured at the zero-crossing wavelengths of 239.27 and 292.51 nm for the quantification of estradiol and at 249.19 nm for progesterone. Method II is ratio subtraction: progesterone was determined at λmax 240 nm after subtraction of the interference exerted by estradiol. Method III is modified amplitude subtraction, which was established using derivative spectroscopy and mathematical manipulations. Method IV is the absorbance ratio technique.
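A hedged sketch of the zero-crossing idea behind Method I, using two synthetic overlapping Gaussian bands rather than the real estradiol and progesterone spectra (all band positions, widths, and concentrations are assumptions):

# Illustrative sketch: at a wavelength where the first derivative of one
# component is zero, the derivative of the mixture depends only on the other
# component, so it scales linearly with that component's concentration.
import numpy as np

wl = np.linspace(200, 320, 1201)                      # wavelength grid, nm

def band(center, width, conc):
    """Toy Gaussian absorption band (Beer-Lambert: absorbance ~ concentration)."""
    return conc * np.exp(-((wl - center) ** 2) / (2 * width ** 2))

def spec_A(c): return band(255, 12, c)                # stand-in for drug A
def spec_B(c): return band(270, 15, c)                # stand-in for drug B

# First derivative of pure A: find the wavelength where dA/dwl crosses zero.
dA = np.gradient(spec_A(1.0), wl)
window = (wl > 240) & (wl < 270)
zc_wl = wl[window][np.argmin(np.abs(dA[window]))]     # ~255 nm (band maximum)

def derivative_at_zc(conc_A, conc_B):
    d_mix = np.gradient(spec_A(conc_A) + spec_B(conc_B), wl)
    return np.interp(zc_wl, wl, d_mix)

slope = derivative_at_zc(0.0, 1.0)                    # calibration with pure B
mixture_signal = derivative_at_zc(0.7, 0.4)           # "unknown" mixture
print(f"zero-crossing at {zc_wl:.1f} nm, estimated [B] ≈ {mixture_signal / slope:.3f}")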
A polarographic study of the interaction between ascorbic acid and Ni²⁺ was carried out at the dropping mercury electrode (DME). This study included the determination of the kinetic parameters (kfh, n) and the thermodynamic parameters, namely the enthalpy change (ΔH), free energy change (ΔG), and entropy change (ΔS), of Ni²⁺ complexes with ascorbic acid in 0.1 M KCl solution over the temperature range 294-309 K. The electrode processes were irreversible and diffusion controlled.
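The abstract does not show the working; as a hedged sketch, thermodynamic parameters in such temperature-dependence studies are commonly obtained from relations of the following form (LaTeX), assuming an equilibrium or formation constant K measured at each temperature:

% Hedged sketch of the standard relations such thermodynamic parameters are
% usually derived from (the abstract does not state the exact procedure).
\[
  \ln K = -\frac{\Delta H}{R}\,\frac{1}{T} + \frac{\Delta S}{R},
  \qquad
  \Delta G = -RT \ln K = \Delta H - T\,\Delta S ,
\]
% so a plot of \ln K versus 1/T over 294-309 K yields \Delta H from the slope
% and \Delta S from the intercept, with \Delta G following at each temperature.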