Cloud users increasingly experience delays when loading web application content over the Internet, largely caused by the latency incurred in reaching cloud datacenters that are distant from the users. This latency severely degrades the availability of services and applications in cloud-centric networks. In the cloud, the workload is distributed across multiple layers, which further increases latency. Time-sensitive Internet of Things (IoT) applications and services running on a cloud platform typically span multiple virtual machines (VMs) and interact in complex ways, which makes it difficult to consolidate applications with heterogeneous workloads. Fog computing brings cloud computing services to the network edge, where computation, communication, and storage are in close proximity to end users' edge devices. It thus makes better use of network bandwidth, enriches mobility, and lowers latency, offering a promising, convenient, and more reliable platform for overcoming these cloud computing issues. In this manuscript, we propose the Fog-based Spider Web Algorithm (FSWA), a heuristic approach that reduces delay time (DT) and improves response time (RT) for workflows among the edge nodes of a fog network. Its main purpose is to trace and locate the nearest fog node (f-node) for computation and thereby reduce latency across the nodes of the network. Reducing latency improves quality-of-service (QoS) parameters, smooths resource distribution, and increases service availability; latency is a key factor in resource-optimization problems in distributed computing environments. Compared with cloud computing, the latency achieved in fog computing is much improved.
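The node-selection step at the heart of this idea can be pictured with a minimal sketch: from an edge device, pick the reachable f-node with the lowest observed latency. The spider-web traversal heuristic itself is the paper's contribution and is not reproduced here; the latency table and function name below are illustrative assumptions.

```python
def nearest_f_node(latency_ms: dict[str, float]) -> str:
    """Return the id of the fog node with the smallest measured latency.

    `latency_ms` maps node id -> measured round-trip time in ms
    (an assumed input; how FSWA gathers these measurements is
    described in the paper, not here).
    """
    return min(latency_ms, key=latency_ms.get)

# Example: three candidate f-nodes probed from an edge device.
print(nearest_f_node({"f1": 12.4, "f2": 7.9, "f3": 21.0}))  # -> "f2"
```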
The genus Latrodectus Walckenaer, 1805 (Araneae: Theridiidae) is a worldwide distributed genus (Graudins et al., 2001) that includes a group of species commonly referred to as widow spiders. It is considered a taxonomically complex genus, as the status of several forms has not been properly evaluated and species boundaries are not well defined or understood (Levi, 1959; 1967; Garb et al., 2001); consequently, in multiple cases populations have been uncritically referred to different taxa. Discriminating between Latrodectus species using morphology has always been problematic (Levi, 1983): the genus is taxonomically difficult, although readily separated from members of other theridiid genera (Mirshamsi, 2005). The genus Asagena Sundevall, 1833 was revalidated…
Vehicular ad hoc networks (VANETs) are considered an emerging technology in the industrial and educational fields. This technology is essential to the deployment of intelligent transportation systems, which aim to improve the safety and efficiency of traffic. VANETs can be implemented effectively by transmitting data among vehicles over multiple hops. However, intrinsic characteristics of VANETs, such as their dynamic network topology and intermittent connectivity, limit data delivery. One particular challenge of this network is that a contributing node may remain in the network for only a limited time. Hence, to prevent data loss from that node, the information must reach the destination…
Cloud computing provides a vast amount of space for data storage, but as the number of users and the size of their data grow, cloud storage environments face serious problems such as saving storage space, managing this large volume of data, and preserving the security and privacy of data. One of the most important methods for saving space in cloud storage is data deduplication, a compression technique that keeps only one copy of the data and eliminates the redundant copies. To offer security and privacy for sensitive data while supporting deduplication, this work identifies attacks that exploit hybrid-cloud deduplication, allowing an attacker to gain access to the files of other users based on very small hash signatures of…
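As a rough illustration of the deduplication mechanism described here, the sketch below keeps a single physical copy per unique content and maps each logical file name to a content hash. The class shape, method names, and the use of SHA-256 (rather than the short hash signatures whose weakness the abstract exploits) are illustrative assumptions, not the paper's implementation.

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: one stored copy per unique content."""

    def __init__(self):
        self.blocks = {}   # content hash -> data (single physical copy)
        self.files = {}    # file name -> content hash (logical reference)

    def put(self, name: str, data: bytes) -> None:
        h = hashlib.sha256(data).hexdigest()
        self.blocks.setdefault(h, data)   # store only if not seen before
        self.files[name] = h

    def get(self, name: str) -> bytes:
        return self.blocks[self.files[name]]

store = DedupStore()
store.put("a.txt", b"same payload")
store.put("b.txt", b"same payload")   # deduplicated: no second copy stored
assert len(store.blocks) == 1
```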
Web application protection lies on two levels: the first is the responsibility of server management, and the second is the responsibility of the site's programmer (the scope of this research). This research proposes developing a secure web application site based on a three-tier architecture (client, server, and database). The security of this system is described as follows: multilevel access by authorization, which means allowing access to pages depending on the authorized level; passwords encrypted using Message Digest Five (MD5) with salt; Secure Socket Layer (SSL) protocol authentication; PHP code written according to a set of rules that hide the source code so it cannot be stolen; and verification of input before it is s…
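A minimal sketch of the salted MD5 password scheme the abstract describes is shown below (the site itself is PHP; Python is used here for illustration). The function names and the 16-byte salt length are assumptions, and MD5 is considered weak by today's standards; this only demonstrates the salt-then-hash pattern.

```python
import hashlib
import os

def hash_password(password: str, salt: bytes | None = None) -> tuple[str, str]:
    """Hash a password with a random salt, MD5(salt || password).

    Returns (salt_hex, digest_hex) for storage. Salt length is an
    assumption; the scheme mirrors the abstract's MD5-plus-salt idea.
    """
    salt = salt or os.urandom(16)
    digest = hashlib.md5(salt + password.encode()).hexdigest()
    return salt.hex(), digest

def verify_password(password: str, salt_hex: str, digest_hex: str) -> bool:
    """Recompute the salted digest and compare against the stored one."""
    salt = bytes.fromhex(salt_hex)
    return hashlib.md5(salt + password.encode()).hexdigest() == digest_hex

salt_hex, stored = hash_password("s3cret")
assert verify_password("s3cret", salt_hex, stored)
```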
In this paper, a new modification is proposed to enhance the security level of the Blowfish algorithm by increasing the difficulty of cracking the original message, making it safer against unauthorized attacks. The algorithm is a symmetric, variable-length-key, 64-bit block cipher, and it is implemented here using grayscale images of different sizes. Instead of using a single key in the cipher operation, the proposed algorithm introduces another key (KEY2) of one-byte length, which takes part in the Feistel function in the first round of both the encryption and decryption processes. In addition, the proposed modified Blowfish algorithm uses five S-boxes instead of four; the additional key (KEY2) is selected randomly from the additional S-box…
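To make the KEY2 idea concrete, here is a hedged sketch of a Blowfish-style F function with a fifth S-box indexed by the one-byte KEY2. The exact injection point, the XOR combination, and the S-box contents are assumptions about the proposed scheme, not the authors' code; standard Blowfish derives its P-array and four 256-entry S-boxes from the digits of pi.

```python
MASK32 = 0xFFFFFFFF  # arithmetic is modulo 2**32, as in Blowfish

def f(x: int, sboxes: list[list[int]], key2: int | None = None) -> int:
    """Blowfish F function over a 32-bit half-block, with an optional
    fifth S-box indexed by the one-byte KEY2 (assumed formulation of
    the paper's first-round modification)."""
    a = (x >> 24) & 0xFF
    b = (x >> 16) & 0xFF
    c = (x >> 8) & 0xFF
    d = x & 0xFF
    # Standard Blowfish: ((S1[a] + S2[b]) ^ S3[c]) + S4[d]  (mod 2**32)
    y = ((sboxes[0][a] + sboxes[1][b]) & MASK32) ^ sboxes[2][c]
    y = (y + sboxes[3][d]) & MASK32
    if key2 is not None:
        y ^= sboxes[4][key2]  # extra S-box lookup driven by KEY2 (round 1 only)
    return y
```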
A new modified differential evolution algorithm, DE-BEA, is proposed to improve the reliability of the standard DE/current-to-rand/1/bin scheme by implementing a new mutation scheme inspired by the bacterial evolutionary algorithm (BEA). The crossover and selection schemes of the DE method are also modified to fit the new DE-BEA mechanism. The new scheme diversifies the population by applying to every individual a segment-based procedure that generates multiple copies (clones) of each individual one by one and applies the BEA segment-wise mechanism. These new steps are embedded in the DE/current-to-rand/bin scheme. The performance of the new algorithm has been compared with several DE variants over eighteen benchmark functions, including several…
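The base mutation that DE-BEA builds on can be written out directly: v_i = x_i + K(x_r1 - x_i) + F(x_r2 - x_r3). The sketch below implements only this standard DE/current-to-rand/1 step; the BEA segment-wise cloning that the paper adds is not reproduced, and the parameter names F and K are the conventional ones, not values from the paper.

```python
import numpy as np

def de_current_to_rand_1(pop: np.ndarray, i: int,
                         F: float = 0.5, K: float = 0.5,
                         rng: np.random.Generator | None = None) -> np.ndarray:
    """One DE/current-to-rand/1 mutant vector for individual i.

    v_i = x_i + K*(x_r1 - x_i) + F*(x_r2 - x_r3), with r1, r2, r3
    distinct indices different from i. This is the unmodified base
    scheme; DE-BEA's segment-wise BEA step would follow it.
    """
    rng = rng or np.random.default_rng()
    candidates = [j for j in range(len(pop)) if j != i]
    r1, r2, r3 = rng.choice(candidates, size=3, replace=False)
    return pop[i] + K * (pop[r1] - pop[i]) + F * (pop[r2] - pop[r3])

# Example: a population of 10 individuals in 5 dimensions.
pop = np.random.default_rng(0).uniform(-5, 5, size=(10, 5))
print(de_current_to_rand_1(pop, i=0))
```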