Optimizing system performance in dynamic and heterogeneous environments depends on the efficient management of computational tasks. This paper therefore examines task scheduling and resource allocation algorithms in depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across workloads generated by varying the task-to-node ratio. The paper identifies finish time and deadline adherence as two key performance metrics for gauging the efficacy of an algorithm, and it carries out a comprehensive investigation of how these algorithms behave across the different workloads. The experimental results reveal distinct patterns in algorithmic behavior by workload. In the 15-task, 5-node scenario, GA and PSO outperform all others, completing 100 percent of tasks before their deadlines, whereas ACO failed to meet the deadline for Task 5. GA and PSO likewise completed 100 percent of tasks before their deadlines with 10 tasks and 5 nodes, while ACO again stumbled on certain tasks. Building on these observations, the study proposes a more extensive system that adapts its choice of algorithm to workload characteristics. The proposed system offers an integrated approach to the ill-structured problem of task scheduling and resource allocation: an intelligent scheduling scheme that runs asynchronously when a large number of tasks is submitted, and that dynamically aborts tasks whenever system load and utilization rise excessively. The design provides a comprehensive solution to task scheduling and resource allocation issues, detailing a method for choosing algorithms based on semantic features, with flexibility as the goal. Quantifiable statistical results from the experiments empirically demonstrate how each algorithm performed under various settings.
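The abstract gives no implementation, but a minimal sketch of the deadline-aware evaluation it describes may help fix ideas. The function below scores a candidate task-to-node assignment by per-node finish times and deadline misses; all names (Task, evaluate, the example data) are illustrative assumptions, not the paper's code.

```python
from dataclasses import dataclass

@dataclass
class Task:
    duration: float   # processing time on a node
    deadline: float   # latest acceptable finish time

def evaluate(assignment, tasks, num_nodes):
    """Score a task-to-node assignment by finish time and deadline misses.

    assignment[i] is the index of the node that runs tasks[i].
    Tasks mapped to the same node run sequentially in list order.
    Returns (makespan, fraction of tasks finished before their deadline).
    """
    node_clock = [0.0] * num_nodes
    met = 0
    for task, node in zip(tasks, assignment):
        node_clock[node] += task.duration      # task starts when its node is free
        if node_clock[node] <= task.deadline:  # deadline checked at finish time
            met += 1
    return max(node_clock), met / len(tasks)

# A metaheuristic (GA, PSO, ACO, FA, or SA) would search over assignments,
# e.g. minimising makespan while penalising deadline misses.
tasks = [Task(3.0, 5.0), Task(2.0, 4.0), Task(4.0, 10.0)]
print(evaluate([0, 1, 0], tasks, num_nodes=2))  # -> (7.0, 1.0)
```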
Permeability is the most important parameter indicating how efficiently reservoir fluids flow through the rock pores to the wellbore. It is typically estimated from well-log evaluation and core measurement techniques. In this paper, permeability has been predicted using the classical and flow zone indicator (FZI) methods. A comparison between the two methods shows the superiority of the FZI correlations, which can be used to estimate permeability in uncored wells with good approximation.
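For context, the FZI approach is conventionally built on the following standard relations between permeability k (in mD) and porosity φ; this is the textbook form of the method, not necessarily the exact correlations fitted in the paper:

```latex
RQI = 0.0314\sqrt{\frac{k}{\phi}}, \qquad
\phi_z = \frac{\phi}{1-\phi}, \qquad
FZI = \frac{RQI}{\phi_z}
\;\Longrightarrow\;
k = 1014\,(FZI)^{2}\,\frac{\phi^{3}}{(1-\phi)^{2}}
```

Rock samples with similar FZI belong to the same hydraulic flow unit, which is what allows a correlation fitted on cored intervals to be carried over to uncored wells.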
A theoretical study of corrosion inhibitors was carried out using quantum chemical calculations, including the semi-empirical PM3 method and Density Functional Theory (DFT) at the B3LYP/6-311++G(2d,2p) level. A benzimidazole derivative, (oxo(4-((phenylcarbamothioyl)carbamoyl)phenyl)ammonio)oxonium (4NBP), and a thiourea derivative, 2-((4-bromobenzyl)thio)-1H-benzo[d]imidazole (2SB), were used as corrosion inhibitors, and the essential quantum chemical parameters correlated with inhibition efficiency, EHOMO (highest occupied molecular orbital energy) and ELUMO (lowest unoccupied molecular orbital energy), were computed. Other parameters were also studied, such as the energy gap [ΔE (HOMO-LUMO)], electron affinity (EA), hardness (η), dipole moment (μ), softness (S), ionization potential (IE), and absolute electronegativity.
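As a reminder, these descriptors are commonly obtained from the frontier orbital energies via the standard Koopmans-theorem relations below (assumed textbook definitions, not values or conventions quoted from the paper; softness is sometimes instead taken as 1/(2η)):

```latex
IE = -E_{HOMO}, \qquad EA = -E_{LUMO}, \qquad
\Delta E = E_{LUMO} - E_{HOMO},
```
```latex
\eta = \frac{IE - EA}{2}, \qquad S = \frac{1}{\eta}, \qquad
\chi = \frac{IE + EA}{2}
```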
In this paper, a method of steganography in audio is introduced for hiding secret data in an audio media file (WAV). Hiding in audio is a challenging discipline, since the Human Auditory System is extremely sensitive. The proposed method embeds a secret text message in the frequency domain of the audio file and consists of two stages: an embedding phase and an extraction phase. In the embedding phase, the audio file is transformed from the time domain to the frequency domain using a 1-level linear wavelet decomposition, and only the high-frequency band is used for hiding the secret message. The text message is encrypted using the Data Encryption Standard (DES) algorithm. Finally, the Least Significant Bit (LSB) algorithm is used to hide the secret message.
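A minimal, self-contained sketch of the embed/extract idea follows. It substitutes an integer Haar lifting step for the paper's wavelet implementation (so the round trip is exact without a wavelet library), omits the DES encryption of the message, and all names are illustrative assumptions rather than the paper's code.

```python
import numpy as np

def haar_forward(x):
    """Integer one-level Haar lifting (S-transform): approx + detail bands."""
    e, o = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
    d = o - e            # detail = high-frequency band
    a = e + (d >> 1)     # approximation = low-frequency band
    return a, d

def haar_inverse(a, d):
    e = a - (d >> 1)
    o = d + e
    x = np.empty(2 * len(a), dtype=np.int64)
    x[0::2], x[1::2] = e, o
    return x

def embed(samples, bits):
    """Hide bits in the LSBs of the high-frequency (detail) coefficients."""
    a, d = haar_forward(samples)
    d[: len(bits)] = (d[: len(bits)] & ~1) | bits  # overwrite LSBs only
    return haar_inverse(a, d)

def extract(samples, n_bits):
    _, d = haar_forward(samples)
    return d[:n_bits] & 1

# Toy demo: in the paper the message bits would first be DES-encrypted.
pcm = np.array([100, 104, 90, 91, 70, 75, 60, 58], dtype=np.int16)
msg = np.array([1, 0, 1, 1])
stego = embed(pcm, msg)
assert list(extract(stego, 4)) == [1, 0, 1, 1]
```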
This study evaluates the suitability of three interpolation methods in terms of their accuracy on climate data for some provinces of southern Iraq. Two data sets of maximum and minimum temperature for February 2008 were taken from nine meteorological stations located in the south of Iraq. ArcGIS is used to produce the spatially distributed temperature data using IDW, ordinary kriging, and spline. Four statistical measures are applied to analyze the results obtained from the three interpolation methods: RMSE, RMSE as a percentage of the mean, model efficiency (E), and bias. These show that ordinary kriging is the best of the methods for this data, based on the results that have been obtained.
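For reference, the two headline criteria are conventionally defined as follows, where O are the observed station values and P the interpolated ones (standard textbook definitions, assumed rather than quoted from the paper):

```latex
RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(P_i - O_i\right)^{2}}, \qquad
E = 1 - \frac{\sum_{i=1}^{n}\left(O_i - P_i\right)^{2}}
             {\sum_{i=1}^{n}\left(O_i - \bar{O}\right)^{2}}
```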
Measuring the level of communicative competence in news headlines, and the level of stylistic and semantic processing in their formulation, requires creating a quantitative scale built on the established principles of scale construction and their standards. The scientific standing of journalism studies lies in the possibility of quantifying journalistic knowledge, i.e. the ability of this knowledge to shift from qualitative language to its equivalent in the language of numbers.
News headlines and their editorial processing are one of the forms of journalistic knowledge that should be studied and analyzed stylistically and semantically, with conclusions drawn and expressed in numbers. Press knowledge is divided into two types:
This research applies non-parametric methods to estimating the conditional survival function, namely the Turnbull and generalized Turnbull estimators, using interval-censored data on breast cancer with two types of treatment (chemotherapy and radiotherapy) and age as a continuous variable. The estimation algorithms were implemented in MATLAB, and the mean square error (MSE) was used as a criterion for comparing the estimates. The results favored the generalized Turnbull estimator of the conditional survival function for both treatments; the estimated survival of the patients does not show very large differences.
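The comparison criterion, stated in its usual form for a survival estimator (an assumed standard definition, with Ŝ the estimate and S the reference survival function at the evaluation points), is:

```latex
MSE(\hat{S}) = \frac{1}{n}\sum_{i=1}^{n}\left(\hat{S}(t_i) - S(t_i)\right)^{2}
```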
In this research, kernel (nonparametric density) estimators were used to estimate a two-response logistic regression, comparing the Nadaraya-Watson method with the local scoring algorithm. The optimal smoothing parameter (bandwidth) λ was estimated by cross-validation and generalized cross-validation; the optimal bandwidth λ has a clear effect on the estimation process and plays a key role in smoothing the fitted curve toward the real curve. The goal of using the kernel estimator is to modify the observations so as to obtain estimators with characteristics close to the properties of the real parameters, based on medical data for patients with chronic disease.
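For orientation, the Nadaraya-Watson estimator with kernel K and bandwidth λ has the standard form below (the textbook definition, not necessarily the paper's exact notation):

```latex
\hat{m}_{\lambda}(x) =
\frac{\sum_{i=1}^{n} K\!\left(\frac{x - x_i}{\lambda}\right) y_i}
     {\sum_{i=1}^{n} K\!\left(\frac{x - x_i}{\lambda}\right)}
```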
This paper discusses the process of compounding two distributions using a new compounding procedure that connects a number of lifetime (continuous) distributions, where the number of these distributions is a random variable following one of the discrete distributions. Based on this procedure, the zero-truncated Poisson distribution has been compounded with the Weibull distribution to produce a new three-parameter lifetime distribution. An advantage of this is that its failure rate function covers many cases (increasing, decreasing, unimodal, bathtub). The properties of the resulting distribution are studied, such as: expectation, variance, cumulative distribution function, reliability function, and failure rate function.
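As a sketch of the construction (one standard way such compounds arise; the paper's exact parameterization may differ): if N follows a zero-truncated Poisson(λ) law, X₁,…,X_N are i.i.d. Weibull with survival F̄(t) = exp(−(t/β)^α), and T = min(X₁,…,X_N), then

```latex
S_T(t) = E\!\left[\bar{F}(t)^{N}\right]
       = \frac{e^{\lambda \bar{F}(t)} - 1}{e^{\lambda} - 1}
       = \frac{\exp\!\left(\lambda e^{-(t/\beta)^{\alpha}}\right) - 1}{e^{\lambda} - 1},
\qquad t > 0,
```

which indeed carries the three parameters (λ, α, β), and whose hazard can take the shapes listed above depending on their values.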