Cloud users are growing impatient with delays in loading web application content over the internet, delays usually caused by the latency incurred when accessing cloud datacenters located far from the users. This is becoming a serious obstacle to consuming services and applications over a cloud-centric network. In the cloud, the workload is distributed across multiple layers, which further increases latency. Time-sensitive Internet of Things (IoT) applications and services, typically hosted on a cloud platform, run across various virtual machines (VMs) and interact in highly complex ways; they are difficult to consolidate when the applications carry heterogeneous workloads. Fog computing brings cloud services to the edge of the network, where computation, communication, and storage sit in close proximity to the end user's edge devices. It therefore makes better use of network bandwidth, improves mobility support, and lowers latency, offering a practical and more reliable platform for overcoming these cloud computing issues. In this manuscript, we propose a Fog-based Spider Web Algorithm (FSWA), a heuristic approach that reduces delay time (DT) and improves response time (RT) for workflows among the edge nodes of a fog network. The main purpose is to trace and locate the nearest fog node (f-node) for computation and thereby reduce latency across the nodes of the network. Reducing latency improves quality of service (QoS) parameters, smooths resource distribution, and increases service availability. Latency is an important factor in resource optimization for distributed computing environments, and it is markedly lower in fog computing than in cloud computing.
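The abstract does not include the FSWA pseudocode; a minimal sketch of the nearest-f-node selection step it describes could look like the following, assuming each fog node advertises a measured round-trip latency and a spare-capacity figure. The FogNode class, the threshold, and the select_nearest_fnode helper are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of nearest-f-node selection (not the authors' FSWA code).
# Assumes each fog node reports round-trip latency and free capacity to the edge device.
from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class FogNode:
    name: str
    latency_ms: float      # measured round-trip latency to the requesting edge device
    free_capacity: float   # fraction of compute capacity currently available

def select_nearest_fnode(nodes: Iterable[FogNode],
                         min_capacity: float = 0.2) -> Optional[FogNode]:
    """Pick the lowest-latency fog node that still has spare capacity."""
    eligible = [n for n in nodes if n.free_capacity >= min_capacity]
    return min(eligible, key=lambda n: n.latency_ms, default=None)

if __name__ == "__main__":
    nodes = [FogNode("f1", 12.5, 0.6), FogNode("f2", 4.8, 0.1), FogNode("f3", 7.1, 0.5)]
    chosen = select_nearest_fnode(nodes)
    print(chosen.name if chosen else "no eligible fog node")  # -> f3 (f2 is overloaded)
```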
This research aims to identify the impact of an electronic participatory learning strategy, delivered through internet programs, on learning some basic basketball skills among first-year intermediate (middle school) students within the prescribed curriculum. The research sample was selected purposively from first-year intermediate school students. Regarding the research problem, the researchers noted a weakness in students' levels of basketball skills, which prompted them to seek appropriate solutions through a participatory learning strategy. The researchers hypothesized statistically significant differences between the pre- and post-tests, in favor of the post-tests individually and in favor of
In this study, simple, low-cost, precise, and rapid spectrophotometric methods developed for the evaluation of sulfacetamide sodium are described. The primary approach involves converting sulfacetamide sodium to a diazonium salt, followed by reaction with p-cresol as a reagent in alkaline medium. The coloured product is orange, with maximum absorbance at λmax 450 nm. Beer's law is obeyed over the concentration range of 5.0-100 µg·mL⁻¹, with a correlation coefficient of R² = 0.9996, a limit of detection of 0.2142 µg·mL⁻¹, a limit of quantification of 0.707 µg·mL⁻¹, and a molar absorptivity of 1488.249 L·mol⁻¹·cm⁻¹. The other approach, cloud point extraction w
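The Beer's-law figures quoted above (slope, R², LOD, LOQ) come from a linear regression of absorbance against concentration. A minimal sketch of that calculation is shown below; the absorbance readings are invented for illustration (not the study's data), and the common LOD = 3.3·σ/slope and LOQ = 10·σ/slope conventions are assumed.

```python
# Illustrative calibration-curve calculation (hypothetical readings, not the study's data).
import numpy as np

conc = np.array([5.0, 10, 20, 40, 60, 80, 100])                 # standards, µg/mL
absorbance = np.array([0.031, 0.060, 0.118, 0.236, 0.352, 0.471, 0.588])

slope, intercept = np.polyfit(conc, absorbance, 1)               # Beer's-law line A = m·C + b
pred = slope * conc + intercept
r2 = 1 - np.sum((absorbance - pred) ** 2) / np.sum((absorbance - absorbance.mean()) ** 2)

sigma = np.std(absorbance - pred, ddof=2)                        # residual standard deviation
lod = 3.3 * sigma / slope                                        # limit of detection (µg/mL)
loq = 10 * sigma / slope                                         # limit of quantification (µg/mL)

print(f"slope={slope:.4f}  R2={r2:.4f}  LOD={lod:.3f} µg/mL  LOQ={loq:.3f} µg/mL")
```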
The rapid changes in transferring and exchanging information via cloud platforms have revolutionized modern visual media, as cloud computing technology has strongly influenced media institutions, saving effort and money while providing high-quality material. The research comprises five chapters: the first covers the methodological framework, the second the theoretical framework (including the concept of cloud computing and cloud computing platforms in visual media), the third the research procedures, the fourth the sample analysis, and the fifth the research results, the most prominent of which include:
1. The cloud service extended the benefit beyond the typic
Due to developments in information system technologies, many techniques have been introduced that play an important role in connecting machines and people through the internet and in controlling and monitoring machines; these technologies are known as cloud computing and the Internet of Things. When computing resources are replaced with manufacturing resources, cloud computing becomes what is called cloud manufacturing.
In this research, cloud computing was applied in the field of manufacturing to automate the selection of the G-Code that a Computer Numerical Control (CNC) machine executes. The process was implemented by combining the machine with Radio Frequency Identification (RFID), AWS Cloud services, and some of py
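The abstract does not include the implementation; a minimal sketch of the pattern it describes, retrieving a G-Code program from cloud storage keyed by a scanned RFID tag ID, could look like the following. The bucket name, key layout, and read_rfid_tag helper are assumptions made for illustration; only the boto3 S3 call is a standard API.

```python
# Illustrative sketch: fetch the G-Code program matching a scanned RFID tag from S3.
# The bucket name, key layout, and RFID reader function are hypothetical.
import boto3

S3_BUCKET = "cnc-gcode-programs"           # assumed bucket holding one G-Code file per part

def read_rfid_tag() -> str:
    """Placeholder for the RFID reader attached to the CNC workstation."""
    return "PART-0042"                     # e.g. the tag ID of the workpiece on the table

def fetch_gcode(tag_id: str) -> str:
    """Download the G-Code file whose key matches the RFID tag ID."""
    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket=S3_BUCKET, Key=f"{tag_id}.gcode")
    return obj["Body"].read().decode("utf-8")

if __name__ == "__main__":
    gcode = fetch_gcode(read_rfid_tag())
    print(gcode.splitlines()[0])           # first block, e.g. "G21 G90 ..."
```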
In recent years, the migration of computational workloads to computational clouds has attracted intruders who target and exploit cloud networks both internally and externally. Investigating such hazardous network attacks in the cloud requires comprehensive network forensics methods (NFMs) to identify the source of the attack; however, cloud computing lacks NFMs capable of identifying attacks that spread through cloud networks and affect various cloud resources. This paper is motivated by the need to determine the applicability of current cloud network forensics methods (C-NFMs) to cloud networks. Their applicability is evaluated in terms of strengths, weaknesses, opportunities, and threats (SWOT) to provide an outlook on the cloud network.
Two simple, rapid, and useful spectrophotometric methods are suggested for the determination of sulphadimidine sodium (SDMS), with and without a cloud point extraction technique, in pure form and in pharmaceutical preparations. The first method is based on diazotization of the sulphadimidine sodium drug with sodium nitrite at 5 ºC, followed by coupling with α-naphthol in basic medium to form an orange-coloured product. The product was stabilized and its absorbance measured at 473 nm. Beer's law was obeyed in the concentration range of 1-12 μg·mL⁻¹. Sandell's sensitivity was 0.03012 μg·cm⁻¹, the detection limit was 0.0277 μg·mL⁻¹, and the limit of quantitation was 0.03605 μg
A three-stage learning algorithm for the deep multilayer perceptron (DMLP), with effective weight initialisation based on a sparse auto-encoder, is proposed in this paper; it aims to overcome the difficulty of training deep neural networks with limited training data in a high-dimensional feature space. In the first stage, unsupervised learning with a sparse auto-encoder is used to obtain the initial weights of the feature extraction layers of the DMLP. In the second stage, error back-propagation trains the DMLP while the weights of its feature extraction layers are kept fixed at the values obtained in the first stage. In the third stage, all the weights of the DMLP obtained in the second stage are refined by error back-propagation. Network structures an
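A minimal sketch of the three-stage scheme described above is given below, using PyTorch with synthetic data and hypothetical layer sizes; it is an illustration of the staged training idea (sparse auto-encoder pretraining, frozen-feature supervised training, then full fine-tuning), not the authors' code.

```python
# Illustrative sketch of the three-stage scheme (synthetic data, hypothetical sizes).
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(256, 100)                        # limited, high-dimensional training data
y = torch.randint(0, 5, (256,))                  # class labels

encoder = nn.Sequential(nn.Linear(100, 32), nn.Sigmoid())   # DMLP feature extraction layer(s)
decoder = nn.Linear(32, 100)                                 # used only for pretraining
classifier = nn.Linear(32, 5)
dmlp = nn.Sequential(encoder, classifier)                    # feature layers + output layer

def train(params, loss_fn, epochs=50):
    opt = torch.optim.Adam(params, lr=1e-3)
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn()
        loss.backward()
        opt.step()

# Stage 1: unsupervised pretraining of the feature layers with a sparse auto-encoder
# (reconstruction loss plus an L1 sparsity penalty on the hidden activations).
mse = nn.MSELoss()
train(list(encoder.parameters()) + list(decoder.parameters()),
      lambda: mse(decoder(encoder(X)), X) + 1e-3 * encoder(X).abs().mean())

# Stage 2: supervised training with the pretrained feature-layer weights frozen.
for p in encoder.parameters():
    p.requires_grad_(False)
ce = nn.CrossEntropyLoss()
train(classifier.parameters(), lambda: ce(dmlp(X), y))

# Stage 3: unfreeze everything and refine all DMLP weights by back-propagation.
for p in encoder.parameters():
    p.requires_grad_(True)
train(dmlp.parameters(), lambda: ce(dmlp(X), y))
```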