Cloud users increasingly experience delays when loading web application content over the Internet, largely caused by the latency of reaching cloud datacenters located far from the users. This is becoming a critical obstacle to consuming services and applications over a cloud-centric network. In the cloud, workload is distributed across multiple layers, which further increases latency. Time-sensitive Internet of Things (IoT) applications and services in a cloud platform typically run over many virtual machines (VMs) and interact in complex ways, and they are difficult to consolidate when the applications carry heterogeneous workloads. Fog computing brings cloud computing services to the edge of the network, where computation, communication, and storage sit in close proximity to the end user's edge devices. It therefore makes better use of network bandwidth, improves mobility support, and lowers latency, offering a convenient and more reliable platform for overcoming these cloud computing issues. In this manuscript, we propose a Fog-based Spider Web Algorithm (FSWA), a heuristic approach that reduces delay time (DT) and improves response time (RT) for workflows spanning the various edge nodes of a fog network. The main purpose is to trace and locate the nearest fog node for computation and thereby reduce latency across the nodes of the network. Reducing latency improves quality-of-service (QoS) parameters, smooths resource distribution, and increases service availability. Latency is an important factor in resource optimization for distributed computing environments, and compared with cloud computing, fog computing improves it considerably.
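As a rough illustration of the nearest-node idea behind FSWA (not the published algorithm itself), the sketch below picks the lowest-latency fog node that still has spare capacity for a task and otherwise falls back to the remote cloud; the node attributes and the selection rule are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class FogNode:
    name: str
    latency_ms: float      # round-trip latency from the requesting edge device
    free_capacity: float   # spare compute capacity (arbitrary units)

def select_fog_node(nodes: List[FogNode], demand: float) -> Optional[FogNode]:
    """Pick the lowest-latency fog node that can absorb the requested demand.

    Returns None when no fog node qualifies, signalling a fall-back to the
    remote cloud datacenter (with its higher latency).
    """
    candidates = [n for n in nodes if n.free_capacity >= demand]
    if not candidates:
        return None
    return min(candidates, key=lambda n: n.latency_ms)

# Example: three fog nodes near the edge device
nodes = [
    FogNode("f1", latency_ms=4.2, free_capacity=2.0),
    FogNode("f2", latency_ms=1.8, free_capacity=0.5),
    FogNode("f3", latency_ms=2.9, free_capacity=3.0),
]
print(select_fog_node(nodes, demand=1.0))   # f3: nearest node with enough capacity
```

In this toy run the nominally closest node (f2) is skipped because it cannot absorb the demand, which is the kind of trade-off a fog-side placement heuristic has to resolve.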
The working system has been built; it consists of a dark box with dimensions (61 cm × 74 cm × 120 cm), with the distance between the test image and the light source set at 120 cm.
To obtain good estimates with more accurate results, the appropriate estimation method must be chosen. Most of the estimating equations in the classical methods are nonlinear, and finding analytical solutions to such equations is very difficult; some estimators are inefficient because of problems in solving these equations. In this paper, we estimate the survival function of censored data using one of the most important artificial intelligence algorithms, the genetic algorithm, to obtain optimal estimates of the parameters of the two-parameter Weibull distribution. This leads to optimal estimates of the survival function. The genetic algorithm is employed with the method of moments, the least squares method, and the weighted least squares method.
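To make the idea concrete, here is a minimal sketch of fitting the two Weibull parameters with a genetic algorithm by minimizing a least-squares distance to an empirical survival curve; the fitness function, GA settings, and the synthetic (uncensored) data are illustrative assumptions rather than the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative uncensored sample (real data would carry censoring indicators)
data = np.sort(rng.weibull(1.5, 50) * 2.0)
# Median-rank estimate of the empirical survival function
emp_surv = 1.0 - (np.arange(1, len(data) + 1) - 0.3) / (len(data) + 0.4)

def fitness(shape, scale):
    """Least-squares distance between S(t) = exp(-(t/scale)**shape) and the empirical curve."""
    model = np.exp(-(data / scale) ** shape)
    return np.sum((model - emp_surv) ** 2)

def genetic_search(pop_size=40, generations=100, mutation=0.1):
    pop = rng.uniform(0.1, 5.0, size=(pop_size, 2))                # columns: shape, scale
    for _ in range(generations):
        scores = np.array([fitness(s, c) for s, c in pop])
        elite = pop[np.argsort(scores)[: pop_size // 2]]           # selection: keep the best half
        parents = elite[rng.integers(0, len(elite), size=(pop_size, 2))]
        children = parents.mean(axis=1)                            # crossover: average two parents
        children += rng.normal(0, mutation, size=children.shape)   # mutation
        pop = np.clip(children, 0.05, 10.0)
    scores = np.array([fitness(s, c) for s, c in pop])
    return pop[np.argmin(scores)]

shape_hat, scale_hat = genetic_search()
print(f"estimated shape={shape_hat:.3f}, scale={scale_hat:.3f}")
```

The estimated pair then plugs directly into the Weibull survival function to give the fitted survival curve.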
Nowadays, mobile communication networks have become a consistent part of our everyday life, transferring huge amounts of data among communicating devices, which leads to new challenges. According to the Cisco Networking Index, more than 29.3 billion networked devices will be connected to the network during the year 2023. It is obvious that the existing infrastructures of current networks will not be able to support all the generated data, due to bandwidth limits and processing and transmission overhead. To cope with these issues, future mobile communication networks must meet high requirements to reduce the amount of transferred data and to decrease latency and computation costs. One of the essential challenging tasks in this subject
With the development of communication technologies for mobile devices and electronic communications, the world has moved toward e-government, e-commerce, and e-banking. It has become necessary to protect these activities from intrusion or misuse, so it is important to design powerful and efficient systems for this purpose. In this paper, several variants of the negative selection immune algorithm are used: negative selection with real-valued detectors, negative selection with fixed-radius detectors, and negative selection with variable-sized detectors, applied to misuse-type network intrusion detection, where the algorithm generates a set of detectors to distinguish the self samples.
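The following is a minimal sketch of real-valued negative selection with a fixed detector radius, one of the variants mentioned above; the detector radius, dimensionality, and data are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

def generate_detectors(self_samples, n_detectors=200, radius=0.12):
    """Real-valued negative selection with a fixed detector radius.

    Candidate detectors are drawn uniformly from the unit hypercube and kept
    only if they do not lie within `radius` of any self (normal) sample.
    """
    detectors = []
    dim = self_samples.shape[1]
    while len(detectors) < n_detectors:
        candidate = rng.random(dim)
        if np.min(np.linalg.norm(self_samples - candidate, axis=1)) > radius:
            detectors.append(candidate)
    return np.array(detectors)

def is_intrusion(sample, detectors, radius=0.12):
    """A sample is flagged when any detector covers it (non-self region)."""
    return bool(np.any(np.linalg.norm(detectors - sample, axis=1) <= radius))

# Illustrative self set: normal traffic features scaled into one corner of [0, 1]^2
self_set = rng.random((100, 2)) * 0.4
detectors = generate_detectors(self_set)
print(is_intrusion(np.array([0.1, 0.2]), detectors))  # likely False: inside the self region
print(is_intrusion(np.array([0.9, 0.9]), detectors))  # likely True: far from the self samples
```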
In this paper, we present a proposed enhancement of image compression using the RLE algorithm. The proposal decreases the size of the compressed image, whereas the original method is used primarily for compressing binary images [1] and mostly increases the size of the original image when applied to color images. The enhanced algorithm is tested on a sample of ten 24-bit true-color BMP images, and an application was built in Visual Basic 6.0 to show the size before and after compression and to compute the compression ratio for both RLE and the enhanced RLE algorithm.
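As an illustration of why plain RLE inflates true-color data and how an enhanced variant can avoid it, the sketch below run-length encodes a row of RGB pixels and emits short runs as literals; the token format and the run-length threshold are assumptions, not the paper's exact scheme.

```python
def rle_encode_pixels(pixels, min_run=3):
    """Run-length encode a sequence of (R, G, B) tuples.

    Runs shorter than `min_run` are emitted as literal pixels so that noisy
    true-color data does not inflate, which is the weakness of naive binary
    RLE when applied to 24-bit images.
    """
    encoded = []
    i = 0
    while i < len(pixels):
        run = 1
        while i + run < len(pixels) and pixels[i + run] == pixels[i] and run < 255:
            run += 1
        if run >= min_run:
            encoded.append(("RUN", run, pixels[i]))    # one token replaces `run` pixels
            i += run
        else:
            encoded.append(("LIT", pixels[i]))          # literal pixel, no long-run assumption
            i += 1
    return encoded

def rle_decode_pixels(encoded):
    out = []
    for token in encoded:
        if token[0] == "RUN":
            out.extend([token[2]] * token[1])
        else:
            out.append(token[1])
    return out

row = [(255, 0, 0)] * 6 + [(0, 255, 0), (0, 0, 255)] + [(255, 255, 255)] * 4
packed = rle_encode_pixels(row)
assert rle_decode_pixels(packed) == row
print(packed)
```

The compression ratio can then be reported as the encoded token count (or byte size) divided by the raw pixel count, mirroring the before/after comparison described in the abstract.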
Abstract: Data mining has become very important at the present time, especially as the amount of information has grown enormously, so it is necessary to use data mining to manage and exploit it. One of the data mining techniques is association rule mining; here we use the pattern-growth method, an enhancement of the Apriori approach. The pattern-growth method depends on the FP-tree structure. This paper presents a modification of the FP-tree algorithm called HFMFFP-Growth, which divides the dataset and, for each part, takes only the most frequent items into the FP-tree, so the final conditional tree has fewer nodes than the original FP-tree and requires less memory space and time.
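A minimal sketch of the divide-and-filter idea is shown below: the dataset is split into parts, each part keeps only its most frequent items, and the pruned transactions are inserted into a shared FP-tree; the partitioning scheme, the top-k filter, and the data are illustrative assumptions rather than the exact HFMFFP-Growth procedure.

```python
from collections import Counter

class FPNode:
    def __init__(self, item, parent=None):
        self.item, self.parent = item, parent
        self.count = 0
        self.children = {}

def insert_transaction(root, items):
    """Insert one already-filtered, frequency-ordered transaction into the FP-tree."""
    node = root
    for item in items:
        node = node.children.setdefault(item, FPNode(item, node))
        node.count += 1

def build_partitioned_fptree(transactions, n_parts=2, top_k=3):
    """Split the dataset, keep only each partition's `top_k` most frequent items,
    and insert the pruned transactions into a shared FP-tree, yielding fewer
    nodes than the full tree would contain."""
    root = FPNode(None)
    size = max(1, len(transactions) // n_parts)
    for start in range(0, len(transactions), size):
        part = transactions[start:start + size]
        counts = Counter(item for t in part for item in t)
        keep = {item for item, _ in counts.most_common(top_k)}
        for t in part:
            filtered = sorted((i for i in t if i in keep), key=lambda i: -counts[i])
            if filtered:
                insert_transaction(root, filtered)
    return root

def count_nodes(node):
    return 1 + sum(count_nodes(c) for c in node.children.values())

transactions = [["a", "b", "c"], ["a", "b"], ["a", "d", "e"],
                ["b", "c", "f"], ["a", "b", "c"], ["c", "e"]]
tree = build_partitioned_fptree(transactions)
print("nodes in pruned tree:", count_nodes(tree) - 1)
```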