The rise of edge-cloud continuum computing results from the growing significance of edge computing, which has become a complement to, or substitute for, traditional cloud services. The convergence of networking and computing presents a notable challenge due to their distinct historical development. Task scheduling is a major challenge in the edge-cloud continuum: the selection of the execution location of tasks is crucial to meeting the quality-of-service (QoS) requirements of applications. An efficient scheduling strategy for distributing workloads among virtual machines in the edge-cloud continuum data center is therefore mandatory to ensure the fulfilment of QoS requirements for both customer and service provider. Existing research has used metaheuristic algorithms to solve the task scheduling problem; however, most of the existing metaheuristics suffer from falling into local minima because they cannot avoid infeasible regions of the solution search space. There is therefore a dire need for an efficient metaheuristic task scheduling algorithm. This study proposes an FPA-ISFLA task scheduling model using hybrid flower pollination and improved shuffled frog leaping algorithms. The simulation results indicate that the FPA-ISFLA algorithm is superior to the PSO algorithm in terms of makespan, resource utilization, and execution cost reduction, especially as the number of tasks increases.
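The scheduling objective this abstract refers to can be made concrete with a small sketch. The greedy longest-processing-time baseline below is not the paper's FPA-ISFLA algorithm, and the task lengths and VM speeds are hypothetical; it only illustrates what minimizing makespan over a task-to-VM assignment means.

```python
# Toy illustration of the scheduling objective: assign tasks to virtual
# machines and measure makespan (completion time of the busiest VM).
# NOT the paper's FPA-ISFLA algorithm; all numbers are hypothetical.

def makespan(assignment, task_lengths, vm_speeds):
    """Completion time of the most loaded VM for a given task->VM mapping."""
    loads = [0.0] * len(vm_speeds)
    for task, vm in enumerate(assignment):
        loads[vm] += task_lengths[task] / vm_speeds[vm]
    return max(loads)

def greedy_schedule(task_lengths, vm_speeds):
    """Assign each task (longest first) to the VM that finishes it earliest."""
    assignment = [0] * len(task_lengths)
    loads = [0.0] * len(vm_speeds)
    for task in sorted(range(len(task_lengths)), key=lambda t: -task_lengths[t]):
        vm = min(range(len(vm_speeds)),
                 key=lambda v: loads[v] + task_lengths[task] / vm_speeds[v])
        assignment[task] = vm
        loads[vm] += task_lengths[task] / vm_speeds[vm]
    return assignment

task_lengths = [40, 10, 30, 20, 25]   # hypothetical task sizes (MI)
vm_speeds = [1.0, 2.0]                # hypothetical VM speeds (MIPS)
plan = greedy_schedule(task_lengths, vm_speeds)
print(plan, makespan(plan, task_lengths, vm_speeds))  # → [1, 0, 0, 1, 1] 42.5
```

A metaheuristic such as the hybrid the study proposes searches this same assignment space globally, with the aim of escaping the local optima a greedy rule like this can fall into.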
Samples of pure copper were deposited on a glass substrate using DC magnetron sputtering with a magnetic flux density of 150 gauss at the centre. The effects of the discharge (sputtering) current (60-75 mA) and the copper target thickness (0.037, 0.055 and 0.085 mm) on the prepared samples were studied, specifically on the height, diameter, and size of the deposited copper grains as well as the surface roughness of the samples, using atomic force microscopy (AFM). The results showed that the specifications of the copper grains can be controlled by changing the discharge current and the target thickness. The increase in discharge curre
A biconical antenna has been developed for ultra-wideband sensing. A wide impedance bandwidth of around 115% over 3.73-14 GHz is achieved, which shows that the proposed antenna is a fairly sensitive sensor for microwave medical imaging applications. The sensor and instrumentation are used together with an improved version of the delay-and-sum image reconstruction algorithm on both fatty and glandular breast phantoms. The relatively new imaging set-up provides robust reconstruction of complex permittivity profiles, especially in glandular phantoms, producing results that are well matched to the geometries and composition of the tissues. The signal-to-clutter and signal-to-mean ratios of the improved method are consis
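Delay-and-sum, the baseline behind the improved reconstruction algorithm this abstract mentions, can be sketched in a few lines. Everything below (geometry, sample rate, wave speed) is a hypothetical toy, not the paper's actual set-up: each antenna's recorded signal is sampled at the round-trip delay to a candidate pixel and the samples are summed coherently, so energy peaks where a scatterer actually sits.

```python
import math

# Toy delay-and-sum (DAS) focusing. All geometry, sampling rate, and wave
# speed values are hypothetical illustrations, not the paper's set-up.

FS = 1e10         # sample rate (Hz), hypothetical
SPEED = 2e8       # assumed wave speed in tissue (m/s), hypothetical

def das_intensity(signals, antennas, pixel):
    """Sum each antenna's signal at the round-trip delay for this pixel."""
    total = 0.0
    for sig, (ax, ay) in zip(signals, antennas):
        dist = math.hypot(pixel[0] - ax, pixel[1] - ay)
        delay = 2 * dist / SPEED               # round trip to the scatterer
        idx = int(round(delay * FS))
        if idx < len(sig):
            total += sig[idx]
    return total ** 2                           # coherent energy

# Synthetic example: one scatterer at (0.02, 0.03) m, two antennas on a line.
antennas = [(0.0, 0.0), (0.04, 0.0)]
scatterer = (0.02, 0.03)
signals = []
for ax, ay in antennas:
    sig = [0.0] * 2000
    d = math.hypot(scatterer[0] - ax, scatterer[1] - ay)
    sig[int(round(2 * d / SPEED * FS))] = 1.0   # ideal point echo
    signals.append(sig)

on_target = das_intensity(signals, antennas, scatterer)
off_target = das_intensity(signals, antennas, (0.0, 0.01))
print(on_target > off_target)  # focusing at the scatterer gives higher energy
```

The paper's improved variant adds refinements on top of this summation; the sketch only shows the focusing principle.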
The current study aims to apply the methods of evaluating investment decisions to extract the highest value and reduce the economic and environmental costs of the health sector in line with its strategy. To achieve the objectives of the study, the researcher relied on the deductive approach for the theoretical side, collecting sources and previous studies, and on the applied practical approach, drawing on the data and reports of Amir almuminin Hospital for the period (2017-2031) to evaluate the hospital's investment decisions. The study reached a set of conclusions, the most important of which is the failure to apply
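One standard investment-appraisal method of the kind such a study applies is net present value (NPV). The sketch below is a generic illustration; the cash flows and discount rate are hypothetical and are not Amir almuminin Hospital's actual figures.

```python
# Hedged illustration of a standard investment-appraisal method (net present
# value). The cash flows and discount rate below are hypothetical.

def npv(rate, cashflows):
    """Discount each period's cash flow back to present value and sum."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Year-0 outlay of 1000, then five years of 300 inflows, at 8% per year.
flows = [-1000, 300, 300, 300, 300, 300]
value = npv(0.08, flows)
print(round(value, 2))  # → 197.81; positive NPV -> the investment adds value
```

A negative NPV under the chosen discount rate would instead argue for rejecting or restructuring the investment.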
Continuous turbidimetric analysis (CTA) for a distinctive analytical application is described, employing a homemade analyser (NAG Dual & Solo 0-180°) that contains two consecutive detection zones (measuring cells 1 & 2). The analyser uses light-emitting diodes as the light source and a set of solar cells as the light detector for turbidity measurements, without needing additional fibres or lenses. The developed method is based on the formation of a yellow turbid precipitate from the reaction between warfarin and the precipitating reagent (potassium dichromate). The CTA method was applied to determine warfarin in pure form and pharmaceu
Evolutionary algorithms outperform heuristic algorithms at finding protein complexes in protein-protein interaction networks (PPINs). Many of these algorithms depend on standard, topology-based frameworks, and many have been examined exclusively on networks with only reliable interaction data. The main objective of this paper is to extend the design of the canonical, topology-based evolutionary algorithms suggested in the literature to cope with noisy PPINs. The design of the evolutionary algorithm is extended based on the functional domain of the proteins rather than on the topological domain of the PPIN. The gene ontology annotation in each molecular function, biological proce
The estimation of linear regression parameters is usually based on the Least Squares method, which rests on several basic assumptions; the accuracy of the estimated model parameters therefore depends on the validity of these assumptions. Among these assumptions are homogeneity of variance and normally distributed errors, and when they do not hold the use of the model becomes unrealistic. Such assumptions are often unattainable when studying a specific problem that may involve complex data from more than one model. The most successful technique for this purpose has proved to be the robust estimation method known as the MM-estimator (minimizing the maximum likelihood estimator), which has demonstrated its efficiency. To
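To make the robust-estimation idea concrete, here is a toy Huber M-estimation of a simple line via iteratively reweighted least squares (IRLS). This is a deliberately simplified stand-in for the MM-estimator discussed above (it omits the high-breakdown initial S-estimation step), and the data, which include one gross outlier, are hypothetical.

```python
# Toy robust-regression sketch: Huber M-estimation of y = a + b*x via
# iteratively reweighted least squares (IRLS). A simplified stand-in for
# the MM-estimator; the data (with one outlier) are hypothetical.

def weighted_fit(xs, ys, ws):
    """Weighted least-squares fit of intercept a and slope b."""
    sw = sum(ws)
    mx = sum(w * x for w, x in zip(ws, xs)) / sw
    my = sum(w * y for w, y in zip(ws, ys)) / sw
    b = (sum(w * (x - mx) * (y - my) for w, x, y in zip(ws, xs, ys))
         / sum(w * (x - mx) ** 2 for w, x in zip(ws, xs)))
    return my - b * mx, b

def huber_fit(xs, ys, k=1.345, iters=50):
    """Downweight large residuals each round so outliers lose influence."""
    ws = [1.0] * len(xs)
    for _ in range(iters):
        a, b = weighted_fit(xs, ys, ws)
        resid = [y - (a + b * x) for x, y in zip(xs, ys)]
        scale = sorted(abs(r) for r in resid)[len(resid) // 2] or 1e-9
        ws = [1.0 if abs(r / scale) <= k else k / abs(r / scale) for r in resid]
    return a, b

xs = [0, 1, 2, 3, 4, 5]
ys = [0.1, 1.0, 2.1, 2.9, 4.2, 30.0]   # last point is a gross outlier
a, b = huber_fit(xs, ys)
print(round(b, 2))  # slope stays near 1 despite the outlier
```

Ordinary least squares on the same data gives a slope above 4, pulled up by the single outlier; the reweighting keeps the fit close to the bulk of the points, which is the behaviour the abstract attributes to robust estimation.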