Cloud computing has attracted the attention of large companies due to its high potential, flexibility, and profitability in providing multiple sources of hardware and software to serve connected users. Given the scale of modern data centers and the dynamic nature of their resource provisioning, effective scheduling techniques are needed to manage these resources while satisfying both cloud providers' and cloud users' goals. Task scheduling in cloud computing is an NP-hard problem that cannot be easily solved by classical optimization methods. Thus, both heuristic and meta-heuristic techniques have been used to provide optimal or near-optimal solutions within an acceptable time frame. This article presents a summary of heuristic and meta-heuristic methods for solving the task scheduling optimization problem in cloud-fog systems. Cost- and time-aware scheduling methods for both bag-of-tasks and workflow tasks are reviewed, discussed, and analyzed thoroughly to give readers a clear basis for selecting the methods that fulfill their needs.
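To make the heuristic side of this survey concrete, the following is a minimal sketch of one classic greedy scheduling heuristic, Minimum Completion Time (MCT): each task goes to the virtual machine that would finish it earliest. The function name, the task-length/VM-speed model, and the inputs are illustrative assumptions, not a method from any surveyed paper.

```python
def mct_schedule(task_lengths, vm_speeds):
    """Greedy Minimum Completion Time (MCT) heuristic:
    assign each task to the VM that would finish it earliest.
    task_lengths: work units per task; vm_speeds: units/second per VM."""
    finish = [0.0] * len(vm_speeds)   # current finish time of each VM
    assignment = []
    for length in task_lengths:
        # completion time if this task were placed on each VM
        candidates = [finish[v] + length / vm_speeds[v]
                      for v in range(len(vm_speeds))]
        best = min(range(len(vm_speeds)), key=lambda v: candidates[v])
        finish[best] = candidates[best]
        assignment.append(best)
    return assignment, max(finish)    # makespan = latest VM finish time

# e.g. four tasks on a slow and a fast VM
assignment, makespan = mct_schedule([8, 4, 6, 2], [1.0, 2.0])
```

Heuristics like this run in polynomial time but give no optimality guarantee, which is exactly the gap the meta-heuristics reviewed in the article try to close.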
One of the main causes for concern is the widespread presence of pharmaceuticals in the environment, which may be harmful to living things. They are often referred to as emerging chemical pollutants in water bodies because they are either still unregulated or undergoing regulation. Pharmaceutical pollution of the environment may have detrimental effects on ecosystem viability, human health, and water quality. In this study, the amount of residual pharmaceutical compounds in environmental waters was determined through a straightforward review. Pharmaceutical production and consumption have increased due to medical advancements, leading to concerns about their environmental impact and potential harm to living things due to their increa…
Blockchain technology relies on cryptographic techniques that provide various advantages, such as trustworthiness, collaboration, organization, identification, integrity, and transparency. Meanwhile, data analytics refers to the process of using techniques to analyze big data and understand the relationships between data points in order to draw meaningful conclusions. The field of data analytics in Blockchain is relatively new, and few studies have examined the challenges involved in Blockchain data analytics. This article presents a systematic analysis of how data analytics affects Blockchain performance, with the aim of investigating the current state of Blockchain-based data analytics techniques in research fields and…
Large amounts of plasma, the fourth state of matter and the most common form of matter in the universe, may be found across our galaxy and other galaxies. Plasma is formed by heating compressed air or inert gases, which are electrically neutral in their natural state, until they separate into charged particles: positively charged ions and negatively charged electrons. Many scientists are currently focusing their efforts on the development of artificial plasma and the possible advantages it may have for humankind in the near future. In the literature, there is a scarcity of information regarding plasma applications. The goal of this article is to describe particular methods for creating and using plasma, which may be us…
Early detection of brain tumors is critical for improving treatment options and extending patient survival. Magnetic resonance imaging (MRI) provides more detailed information, such as greater contrast and clarity, than any other scanning method. Manually segmenting brain tumors from the many MRI images collected in clinical practice for cancer diagnosis is a difficult and time-consuming task. Algorithms and machine-learning technologies can detect tumors in brain MRI scans, easing the process for doctors, because an MRI image can appear healthy even when the person has a tumor or a malignancy. Recently, deep learning techniques based on deep convolutional neural networks have been used to analyze med…
Cloud computing is a mass platform serving high volumes of data from multiple devices and numerous technologies. Cloud tenants demand fast access to their data without any disruptions. Therefore, cloud providers struggle to ensure that every piece of data is secured and always accessible. Hence, an appropriate replication strategy capable of selecting essential data is required in cloud replication environments. This paper proposes a Crucial File Selection Strategy (CFSS) to address poor response time in a cloud replication environment. A cloud simulator called CloudSim is used to conduct the necessary experiments, and results are presented as evidence of the improvement in replication performance. The obtained…
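The core idea of selecting "crucial" files for replication can be illustrated with a toy ranking function. The scoring formula below (access count discounted by file age) is a hypothetical stand-in, not the actual CFSS criterion, which the abstract does not specify.

```python
def select_crucial_files(stats, top_k=2):
    """Rank files by an illustrative popularity score (access count
    weighted by recency) and return the top_k replication candidates.
    This scoring rule is a hypothetical sketch, not the CFSS formula."""
    scored = sorted(stats,
                    key=lambda f: f["accesses"] / (1 + f["age_hours"]),
                    reverse=True)
    return [f["name"] for f in scored[:top_k]]
```

Replicating only the highest-scoring files keeps storage overhead bounded while improving response time for the data tenants actually request.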
Abstract
In this work, two metaheuristic algorithms were hybridized. The first is the Invasive Weed Optimization algorithm (IWO), a numerical stochastic optimization algorithm; the second is the Whale Optimization Algorithm (WOA), an algorithm based on swarm and community intelligence. IWO is a nature-inspired algorithm, modeled specifically on the colonizing behavior of weeds, and was first proposed in 2006 by Mehrabian and Lucas. Due to their strength and adaptability, weeds pose a serious threat to cultivated plants, making them a threat to the cultivation process. The behavior of these weeds has been simulated and used in Invas…
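The colonizing behavior IWO simulates can be sketched in a few lines: fitter weeds spawn more seeds, seeds are scattered around the parent with a spread that shrinks over time, and the population is truncated by competitive exclusion. The one-dimensional version below, with its parameter values, is a minimal illustrative sketch, not the hybrid IWO-WOA method of this abstract.

```python
import random

def iwo_minimize(f, lo, hi, pop=5, iters=100, smin=1, smax=5, pmax=20,
                 sigma0=1.0, sigmaf=0.01, seed=0):
    """Minimal 1-D Invasive Weed Optimization sketch.
    Fitter weeds (lower cost) spawn more seeds; seeds are scattered
    with a standard deviation that decays nonlinearly over iterations;
    the population is truncated to the fittest pmax weeds."""
    rng = random.Random(seed)
    weeds = [rng.uniform(lo, hi) for _ in range(pop)]
    for t in range(iters):
        sigma = sigmaf + (sigma0 - sigmaf) * ((iters - t) / iters) ** 2
        costs = [f(w) for w in weeds]
        best, worst = min(costs), max(costs)
        offspring = []
        for w, c in zip(weeds, costs):
            # seed count scales linearly with relative fitness
            ratio = 0.0 if worst == best else (worst - c) / (worst - best)
            nseeds = smin + round(ratio * (smax - smin))
            for _ in range(nseeds):
                offspring.append(min(hi, max(lo, w + rng.gauss(0, sigma))))
        # competitive exclusion: keep only the fittest pmax weeds
        weeds = sorted(weeds + offspring, key=f)[:pmax]
    return weeds[0]
```

The shrinking scatter radius is what moves IWO from exploration early on to fine-grained exploitation near the end, the same balance the WOA hybridization in this work aims to improve.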
Abstract
The Non-Homogeneous Poisson process is a statistical subject of importance to other sciences, with wide application in areas such as queueing systems, repairable systems, computer and communication systems, and reliability theory, among many others. It is also used to model phenomena that occur at a non-constant rate over time (events whose rate changes with time).
This research deals with some of the basic concepts related to the Non-Homogeneous Poisson process. It applies two models of the Non-Homogeneous Poisson process, the power-law model and the Musa-Okumoto model, to estimate th…
Cloud computing offers a new way of provisioning services by rearranging various resources over the Internet. The most important and popular cloud service is data storage. To preserve the privacy of data holders, data are often stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is becoming crucial for big data storage and processing in the cloud. Traditional deduplication schemes cannot work on encrypted data. Among such data, digital videos are particularly large in terms of storage cost and size, so techniques that support the legal interests of video owners, such as copyright protection, while reducing cloud storage cost and size are always desired. This paper focuses on v…
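One well-known way to reconcile encryption with deduplication, which may or may not be the scheme this paper builds on, is convergent encryption: deriving the key from a hash of the plaintext, so identical plaintexts always produce identical ciphertexts that the server can deduplicate without seeing the content. The sketch below uses a SHA-256 counter keystream as a stand-in for a real block cipher; all names are illustrative.

```python
import hashlib

def convergent_encrypt(data: bytes):
    """Convergent-encryption sketch: key = H(plaintext), so equal
    plaintexts yield equal ciphertexts and remain deduplicable.
    XOR with a SHA-256 counter keystream stands in for a real cipher."""
    key = hashlib.sha256(data).digest()
    out = bytearray()
    for block in range(0, len(data), 32):
        stream = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        out.extend(b ^ s for b, s in zip(data[block:block + 32], stream))
    return key, bytes(out)

def dedup_store(store, data):
    """Index ciphertext by its own hash; return True on a duplicate."""
    key, ct = convergent_encrypt(data)
    tag = hashlib.sha256(ct).hexdigest()
    duplicate = tag in store
    store[tag] = ct
    return duplicate
```

For video data, the same idea is typically applied per chunk rather than per file, so that partially overlapping videos still deduplicate.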