Optimizing system performance in dynamic, heterogeneous environments depends on the efficient management of computational tasks, so this paper examines task scheduling and resource allocation algorithms in depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across workloads produced by varying the task-to-node ratio. Finish time and deadline are identified as the two key performance metrics for gauging the efficacy of an algorithm, and the behavior of each algorithm is investigated comprehensively across the different workloads. The experimental results reveal distinct patterns in algorithmic behavior by workload. In the 15-task, 5-node scenario, the GA and PSO algorithms outperform all others, completing 100 percent of tasks before their deadlines, whereas Task 5 proved problematic for the ACO algorithm. Likewise, with 10 tasks and 5 nodes, GA and PSO completed 100 percent of tasks before their deadlines, while ACO stumbled on certain tasks. On this basis, the study proposes a more extensive system built around an adaptive algorithmic approach driven by workload characteristics. As stated in the study, the proposed system offers an integrated approach to the ill-structured problem of task scheduling and resource allocation: an intelligent, aggressive scheduling scheme that runs asynchronously when a larger number of tasks is submitted for completion, and that dynamically aborts tasks whenever system load and utilization rise excessively. The design presents a full-fledged solution to task scheduling and resource allocation issues, detailing a method for choosing algorithms based on semantic features and aiming at flexibility.
Quantifiable statistical results from the experiments empirically demonstrate how each algorithm performed under the various settings.
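To make the GA variant of the evaluation concrete, the sketch below encodes a schedule as a task-to-node assignment and evolves it toward the two metrics the abstract names, deadline satisfaction and finish time. This is a minimal illustration, not the paper's implementation: the FIFO-per-node cost model, population size, mutation rate, and fitness weighting are all assumptions.

```python
import random

def evaluate(schedule, durations, deadlines, num_nodes):
    """Finish time and deadlines met, assuming FIFO execution on each node."""
    node_free = [0.0] * num_nodes   # time at which each node becomes idle
    makespan, met = 0.0, 0
    for task, node in enumerate(schedule):
        finish = node_free[node] + durations[task]
        node_free[node] = finish
        makespan = max(makespan, finish)
        if finish <= deadlines[task]:
            met += 1
    return makespan, met

def ga_schedule(durations, deadlines, num_nodes,
                pop_size=30, generations=100, seed=0):
    """Evolve a task-to-node assignment; deadlines dominate, then finish time."""
    rng = random.Random(seed)
    n = len(durations)

    def fitness(s):
        makespan, met = evaluate(s, durations, deadlines, num_nodes)
        return met * 1000 - makespan   # assumed weighting: deadlines first

    pop = [[rng.randrange(num_nodes) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]        # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]           # one-point crossover
            if rng.random() < 0.2:              # mutation: reassign one task
                child[rng.randrange(n)] = rng.randrange(num_nodes)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)
```

For a 10-task, 5-node workload with generous deadlines, the evolved assignment typically meets every deadline, mirroring the GA result reported above; PSO, ACO, FA, and SA would swap in different search operators over the same encoding and fitness.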
The apricot plant was washed, dried, and powdered after harvesting to produce a fine powder used in water treatment. An alcoholic extract of the apricot plant was prepared using ethanol and analysed by GC-MS, Fourier transform infrared spectroscopy (FTIR), and ultraviolet-visible (UV-Vis) spectroscopy to identify the active components. Zinc nanoparticles were then synthesized from the alcoholic extract and characterized by FTIR, UV-Vis, SEM, EDX, and TEM. Using a continuous processing procedure, the zinc nanoparticles, together with the apricot extract and powder, were employed to clean polluted water. Firstly, 2 g of zinc nanoparticles were used with 20 ml of polluted water, and the results were Tetra 44% and Levo 32%; after
Recording an electromyogram (EMG) signal is essential for diagnostic procedures such as muscle health assessment and motor neuron control. EMG signals have been used as a control source for powered prosthetics, supporting people in accomplishing their activities of daily living (ADLs). This work studies different types of hand grips and their relationship with EMG activity. Five subjects carried out four functional movements (fine pinch, tripod grip, grip with the middle finger and thumb, and power grip). A hand dynamometer was used while recording the EMG activity from three muscles, namely the Flexor Carpi Radialis (FCR), Flexor Digitorum Superficialis (FDS), and Abductor Pollicis Brevis (ABP), with different le
The study aims to discuss the relation between imported inflation and the international trade of the Iraqi economy for the period 1990-2015 using annual data. To achieve this aim, statistical and econometric methods are applied through the NARDL model, which is designed to measure non-linear relations (as most economic relations are non-linear) and to distinguish the positive and negative effects of imported inflation. A deductive approach was adopted, using the descriptive method to describe and delimit the phenomenon, alongside an inductive approach employing statistical and econometric tools to arrive at the standard model that explains the
The objective of this research is to employ special cases of the trapezoidal function in the composition of fuzzy sets for decision making within the framework of traditional game theory, in order to determine the best strategy for the mobile phone networks in the provinces of Baghdad and Basra. Different intervals of the membership functions were adopted to observe the change occurring in the payoff matrix and its impact on the strategies and decisions available to each player, and the impact on societ
A simple, fast, inexpensive, and sensitive method has been proposed to screen and optimize the experimental factors affecting the determination of phenylephrine hydrochloride (PHE.HCl) in pure and pharmaceutical formulations. The method is based on the development of a brown-colored charge transfer (CT) complex with p-Bromanil (p-Br) in an alkaline medium (pH = 9) within 1.07 min after heating at 80 °C. 'Design of Experiments' (DOE) employing a 'Central Composite Face Centered Design' (CCF) and 'Response Surface Methodology' (RSM) were applied as an improvement over the traditional 'One Variable at a Time' (OVAT) approach to evaluate the effects of variations in the selected factors (volume of 5×10-3 M p-Br, heating time, and temperature) on
Researchers have shown increased interest in recent years in determining the optimum sample size needed to obtain sufficient accuracy in estimation and high-precision parameters when evaluating a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is estimated from the sample produced by each method in high-dimensional data using artificial intelligence, namely an artificial neural network (ANN), as it gives a high-precision estimate commensurate with the dat
Background: Insufficient sleep due to excessive media use is linked to decreased physical activity, poor nutrition, obesity, and decreased overall health-related quality of life.
Objectives: To assess the effect of using the internet and social media on the sleep of 4th-stage secondary school students.
Subjects and Methods: A cross-sectional study with an analytic element was conducted among 500 secondary school students, obtained by randomly choosing two schools from each of the six educational directorates, using a structured questionnaire.
Result: Secondary scho
The novel Vierordt's approach, or simultaneous equation method, was developed and validated for the concurrent determination of vincristine sulfate (VCS) and bovine serum albumin (BSA) in pure solutions using UV spectrophotometry. It is simple, precise, economical, rapid, reliable, and accurate. The method depends on measuring absorbance at two wavelengths, 296 nm and 278 nm, corresponding to the λmax of VCS and BSA in deionized water, respectively. The calibration curves of VCS and BSA are linear over concentration ranges of 10–60 μg/mL and 200–1600 μg/mL, with correlation coefficients (R2) of 1 and 0.999, respectively. The limits of detection (LOD) and quantification (LO
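The simultaneous-equation step of Vierordt's method reduces to solving two linear equations: A1 = ax1·Cx + ay1·Cy and A2 = ax2·Cx + ay2·Cy, where A1, A2 are the mixture absorbances at the two wavelengths and the a coefficients are the absorptivities of each analyte at each wavelength. The sketch below solves this 2×2 system by Cramer's rule; the coefficient values in the usage note are hypothetical placeholders, not the paper's measured absorptivities for VCS and BSA.

```python
def vierordt(a1, a2, ax1, ay1, ax2, ay2):
    """Recover concentrations Cx, Cy of two analytes from absorbances
    a1, a2 measured at two wavelengths, given the four absorptivity
    coefficients (analyte x/y at wavelength 1/2). Cramer's rule on:
        a1 = ax1*Cx + ay1*Cy
        a2 = ax2*Cx + ay2*Cy
    """
    det = ax1 * ay2 - ay1 * ax2
    if det == 0:
        raise ValueError("absorptivity matrix is singular; "
                         "wavelengths do not discriminate the analytes")
    cx = (a1 * ay2 - ay1 * a2) / det
    cy = (ax1 * a2 - a1 * ax2) / det
    return cx, cy

# Hypothetical absorptivities (per ug/mL) at 296 nm and 278 nm:
# vierordt(2.88, 1.12, 0.045, 0.0012, 0.010, 0.0008)
# recovers Cx = 40 ug/mL and Cy = 900 ug/mL.
```

The method is only well-conditioned when the determinant of the absorptivity matrix is comfortably nonzero, which is why the two wavelengths are chosen at the λmax of each analyte.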