The rise of edge-cloud continuum computing is a result of the growing significance of edge computing, which has become a complementary or substitute option for traditional cloud services. The convergence of networking and computing presents a notable challenge due to their distinct historical development. Task scheduling is a major challenge in the context of edge-cloud continuum computing: the selection of the execution location of tasks is crucial in meeting the quality-of-service (QoS) requirements of applications. An efficient scheduling strategy for distributing workloads among virtual machines in the edge-cloud continuum data center is mandatory to ensure the fulfilment of QoS requirements for both the customer and the service provider. …
Optimizing system performance in dynamic, heterogeneous environments and managing computational tasks efficiently are crucial. This paper therefore examines task scheduling and resource allocation algorithms in depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across various workloads obtained by varying the task-to-node ratio. The paper identifies Finish Time and Deadline as two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behaviour of these algorithms across the different workloads was carried out. Results from the experiment …
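To illustrate the kind of metaheuristic search such comparisons involve, the sketch below applies simulated annealing to a toy task-to-node assignment problem, minimizing makespan (finish time). The task durations, cooling schedule, and cost function are illustrative assumptions, not the paper's experimental setup.

```python
import math
import random

def makespan(assignment, durations, num_nodes):
    """Finish time of the busiest node under a task-to-node assignment."""
    loads = [0.0] * num_nodes
    for task, node in enumerate(assignment):
        loads[node] += durations[task]
    return max(loads)

def sa_schedule(durations, num_nodes, iters=5000, t_start=10.0, t_end=0.01):
    """Toy SA scheduler: move one random task to a random node each step."""
    assignment = [random.randrange(num_nodes) for _ in durations]
    cost = makespan(assignment, durations, num_nodes)
    best, best_cost = assignment[:], cost
    for i in range(iters):
        # Geometric cooling schedule (an illustrative choice).
        t = t_start * (t_end / t_start) ** (i / iters)
        task = random.randrange(len(durations))
        old_node = assignment[task]
        assignment[task] = random.randrange(num_nodes)
        new_cost = makespan(assignment, durations, num_nodes)
        # Accept improvements always, worsenings with Boltzmann probability.
        if new_cost <= cost or random.random() < math.exp((cost - new_cost) / t):
            cost = new_cost
            if cost < best_cost:
                best, best_cost = assignment[:], cost
        else:
            assignment[task] = old_node  # revert the rejected move
    return best, best_cost

durations = [random.uniform(1, 10) for _ in range(40)]  # 40 tasks, 8 nodes
print(sa_schedule(durations, num_nodes=8))
```

The same makespan cost function could be plugged into GA, PSO, ACO, or FA search loops, which is what makes such head-to-head workload comparisons feasible.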
Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, while vital, these data centers also face heightened vulnerability to hacking because they are convergence points for numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within green data centers, is a pressing concern. This paper proposes a novel approach to mitigating this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focuses on the …
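As a rough illustration of a colony-style detection strategy, the sketch below lets simulated "ants" accumulate pheromone on devices whose telemetry looks anomalous, so that compromised devices surface through positive feedback. The telemetry scores, parameters, and scoring scheme are all hypothetical; the paper's actual colony strategy is not specified in this excerpt.

```python
import random

# Hypothetical telemetry: device id -> anomaly score in [0, 1]
# (values here are made up purely for illustration).
telemetry = {f"dev{i}": random.random() * 0.3 for i in range(20)}
telemetry["dev7"] = 0.95    # a planted "compromised" device
telemetry["dev13"] = 0.90

def colony_detect(telemetry, ants=200, steps=30, evaporation=0.1):
    """Pheromone accumulates on devices that ants find anomalous."""
    devices = list(telemetry)
    pheromone = {d: 1.0 for d in devices}
    for _ in range(steps):
        for _ in range(ants):
            # Ants prefer devices with more pheromone (positive feedback).
            weights = [pheromone[d] for d in devices]
            d = random.choices(devices, weights=weights)[0]
            # Deposit pheromone proportional to local anomaly evidence.
            pheromone[d] += telemetry[d]
        # Evaporation keeps scores bounded and forgets stale evidence.
        for d in devices:
            pheromone[d] *= (1.0 - evaporation)
    return sorted(devices, key=pheromone.get, reverse=True)

print(colony_detect(telemetry)[:3])  # most suspicious devices first
```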
This paper introduces a non-conventional approach with multi-dimensional random sampling to solve a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time via hybrid integration of the classical numerical finite difference (FD) formula with the Latin hypercube sampling (LHS) technique, creating a random distribution for the model parameters, which depend on time t. The LHS technique enables the MLHFD method to produce fast variation of the parameter values across a number of multidimensional simulations (100, 1000, and 5000). The generated Latin hypercube sample, which is random or non-deterministic in nature, is further integrated …
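A minimal sketch of the MLHFD idea as described: draw Latin hypercube samples for uncertain parameters, run a finite-difference solve per sample, and take the mean over the simulations. The logistic-style ODE, parameter ranges, and forward-Euler scheme below are placeholder assumptions, not the paper's cocaine abuse model.

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng):
    """Stratified LHS: one sample per equal-probability stratum per dimension."""
    dims = len(bounds)
    u = (rng.random((n_samples, dims)) + np.arange(n_samples)[:, None]) / n_samples
    for d in range(dims):
        rng.shuffle(u[:, d])                       # decorrelate dimensions
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

def fd_solve(alpha, beta, x0=0.1, t_end=10.0, dt=0.01):
    """Forward-Euler FD solve of a placeholder ODE
    x' = alpha*x*(1-x) - beta*x (NOT the paper's model)."""
    steps = int(t_end / dt)
    x = np.empty(steps + 1)
    x[0] = x0
    for k in range(steps):
        x[k + 1] = x[k] + dt * (alpha * x[k] * (1 - x[k]) - beta * x[k])
    return x

rng = np.random.default_rng(0)
samples = latin_hypercube(1000, bounds=[(0.2, 0.8), (0.05, 0.2)], rng=rng)
runs = np.stack([fd_solve(a, b) for a, b in samples])
mean_trajectory = runs.mean(axis=0)   # the "mean" in MLHFD
print(mean_trajectory[-1])
```

Repeating this with 100, 1000, and 5000 samples mirrors the multidimensional simulation counts mentioned in the abstract.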
Illegal distribution of digital data is a common danger in the film industry, especially with the rapid spread of the Internet, which makes it easy to distribute pirated copies of digital video on a global scale. A watermarking system inserts invisible marks into the video content without changing the content itself. The aim of this paper is to build an invisible video watermarking system with high imperceptibility. Firstly, the watermark is scrambled using the Arnold transform and then divided into equal, non-overlapping blocks. Each block is then embedded in a specific frame using the Discrete Wavelet Transform (DWT), where the HL band is used for this purpose. Regarding the method of selecting the host frames, the …
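The sketch below shows the two stages named in the abstract on a single frame: Arnold-transform scrambling of the watermark, then additive embedding into a detail band of a one-level DWT. The PyWavelets library, Haar wavelet, embedding strength, and the use of pywt's horizontal-detail coefficients as the "HL" band are assumptions for illustration.

```python
import numpy as np
import pywt  # PyWavelets

def arnold(img, iterations=1):
    """Arnold cat map scramble of a square N x N watermark."""
    n = img.shape[0]
    out = img.copy()
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

def embed_block(frame, wm_block, alpha=0.05):
    """Additively embed one scrambled watermark block into a detail band
    of a one-level Haar DWT of a frame (alpha and wavelet are illustrative)."""
    cA, (cH, cV, cD) = pywt.dwt2(frame.astype(float), 'haar')
    h, w = wm_block.shape
    cH[:h, :w] += alpha * wm_block          # detail band hosts the mark
    return pywt.idwt2((cA, (cH, cV, cD)), 'haar')

frame = np.random.randint(0, 256, (256, 256)).astype(float)
wm = arnold(np.random.randint(0, 2, (32, 32)).astype(float), iterations=3)
watermarked = embed_block(frame, wm)
print(np.abs(watermarked - frame).max())    # imperceptibly small pixel change
```

Embedding in a detail band rather than the approximation band is what keeps the change visually imperceptible, since detail coefficients carry high-frequency content the eye is less sensitive to.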
The expanding use of multi-processor supercomputers has made a significant impact on the speed and size of many problems. The adoption of the standard Message Passing Interface (MPI) protocol has enabled programmers to write portable and efficient code across a wide variety of parallel architectures. Sorting is one of the most common operations performed by a computer. Because sorted data are easier to manipulate than randomly ordered data, many algorithms require sorted data. Sorting is of additional importance to parallel computing because of its close relation to the task of routing data among processes, which is an essential part of many parallel algorithms. In this paper, sequential sorting algorithms, the parallel implementation of many …
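As a representative MPI sorting pattern (not necessarily one of the paper's algorithms), the sketch below implements odd-even transposition sort over block-distributed data using the mpi4py bindings, which are an assumption here; the routing of blocks between neighbouring ranks is exactly the data-movement aspect the abstract highlights.

```python
# Run with e.g.: mpiexec -n 4 python oddeven_sort.py
import random
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

local = sorted(random.sample(range(10000), 100))  # each rank's sorted block

def merge_keep(mine, theirs, keep_low):
    """Merge two sorted blocks, keep the lower or upper half."""
    merged = sorted(mine + theirs)
    return merged[:len(mine)] if keep_low else merged[len(mine):]

for phase in range(size):
    # Pair neighbours: even phases pair (0,1)(2,3)..., odd pair (1,2)(3,4)...
    if phase % 2 == 0:
        partner = rank + 1 if rank % 2 == 0 else rank - 1
    else:
        partner = rank + 1 if rank % 2 == 1 else rank - 1
    if 0 <= partner < size:
        theirs = comm.sendrecv(local, dest=partner, source=partner)
        local = merge_keep(local, theirs, keep_low=(rank < partner))

print(f"rank {rank}: min={local[0]} max={local[-1]}")  # globally ordered blocks
```

After `size` phases, every rank holds a sorted block and the blocks are globally ordered by rank, which is the usual correctness guarantee for odd-even transposition sort.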
The characteristics of sulfur nanoparticles were studied using atomic force microscopy (AFM). The AFM measurements showed that the average size of sulfur nanoparticles synthesized from a sodium thiosulfate solution through Cucurbita pepo extract was 93.62 nm. Protection of galvanized steel from corrosion in salt media was achieved using sulfur nanoparticles at different temperatures. The obtained thermodynamic data in the presence of sulfur nanoparticles showed higher values compared with their counterparts in the absence of sulfur nanoparticles; the inhibition efficiency (%IE) and corrosion resistance were highest at high temperature, and the corrosion rate or weight …
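For context, inhibition efficiency in weight-loss corrosion studies is commonly computed as below; this is the standard convention, not necessarily the exact expression used in the paper.

```latex
% Standard weight-loss inhibition efficiency (common convention;
% the paper's exact expression is not shown in this excerpt):
\%IE = \frac{W_{0} - W_{\mathrm{inh}}}{W_{0}} \times 100
% W_0: weight loss without inhibitor;
% W_inh: weight loss with the sulfur nanoparticle inhibitor.
```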