Fog-based Spider Web Algorithm to Overcome Latency in Cloud Computing

Cloud users are growing impatient with delays in loading web application content over the internet, delays usually caused by the latency of reaching cloud datacenters located far from the users. This is becoming a critical obstacle to using services and applications over a cloud-centric network. In the cloud, the workload is also distributed across multiple layers, which further increases latency. Time-sensitive Internet of Things (IoT) applications and services running on a cloud platform span many virtual machines (VMs) and interact in complex ways, making it difficult to consolidate applications with heterogeneous workloads. Fog computing brings cloud services to the network edge, placing computation, communication, and storage in proximity to end users' edge devices. It thus makes better use of network bandwidth, improves mobility support, and lowers latency, offering a convenient and more reliable platform for overcoming these cloud computing issues. In this manuscript, we propose the Fog-based Spider Web Algorithm (FSWA), a heuristic approach that reduces delay time (DT) and improves response time (RT) during workflow execution among the edge nodes of a fog network. Its main purpose is to trace and locate the nearest f-node for computation, thereby reducing latency across the nodes of the network. Reducing latency improves quality-of-service (QoS) parameters, smooths resource distribution, and increases service availability; latency is an important factor in resource-optimization problems in distributed computing environments. Compared with cloud computing, latency in fog computing is much improved.
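The nearest-f-node selection step described above can be sketched as follows. This is a minimal illustration, not the published FSWA implementation: the node names, latency figures, and the fall-back-to-cloud rule are all assumptions made for the example.

```python
# Hypothetical sketch: pick the fog node with the lowest estimated
# round-trip latency, falling back to the cloud path when no fog node
# beats it. Latency values and node ids are illustrative assumptions.

def nearest_f_node(fog_nodes, cloud_latency_ms):
    """fog_nodes: dict mapping node id -> estimated latency in ms.
    Returns (node_id, latency_ms) for the best choice."""
    if not fog_nodes:
        return ("cloud", cloud_latency_ms)
    best = min(fog_nodes, key=fog_nodes.get)
    if fog_nodes[best] < cloud_latency_ms:
        return (best, fog_nodes[best])
    return ("cloud", cloud_latency_ms)

nodes = {"f1": 12.0, "f2": 7.5, "f3": 20.0}
print(nearest_f_node(nodes, cloud_latency_ms=80.0))  # -> ('f2', 7.5)
```

Routing computation to the lowest-latency f-node is what drives the DT and RT improvements the abstract claims, since the request never crosses the high-latency path to a distant datacenter.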

Publication Date
Mon Jan 01 2018
Journal Name
Pakistan Entomologist
Taxonomic and molecular study of the widow spider genus Latrodectus Walckenaer, 1805 (Araneae: Theridiidae) in Iraq

The genus Latrodectus Walckenaer, 1805 (Araneae: Theridiidae) is distributed worldwide (Graudins et al., 2001) and comprises the species commonly referred to as widow spiders. It is considered a taxonomically complex genus: the status of several forms has not been properly evaluated, and species boundaries are not well defined or understood (Levi, 1959; 1967; Garb et al., 2001); therefore, in many cases, populations have been uncritically referred to different taxa. Discriminating between Latrodectus species using morphology has always been problematic (Levi, 1983); the genus is taxonomically difficult, although readily separated from members of other theridiid genera (Mirshamsi, 2005). The genus Asagena Sundevall, 1833 was revalidated …
Publication Date
Thu Nov 17 2016
Journal Name
PLOS ONE
Efficient and Stable Routing Algorithm Based on User Mobility and Node Density in Urban Vehicular Network

Vehicular ad hoc networks (VANETs) are an emerging technology in the industrial and educational fields. This technology is essential to the deployment of intelligent transportation systems, which aim to improve the safety and efficiency of traffic. VANETs are effectively implemented by transmitting data among vehicles over multiple hops. However, intrinsic characteristics of VANETs, such as their dynamic network topology and intermittent connectivity, limit data delivery. One particular challenge of this network is that a contributing node may remain in the network for only a limited time; hence, to prevent data loss from that node, the information must reach the destina…
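The mobility- and density-aware next-hop selection the title describes can be sketched as a weighted score over candidate neighbors. The scoring formula, the weights, and the normalized inputs below are illustrative assumptions, not the paper's algorithm.

```python
# Illustrative sketch: rank candidate next hops by a weighted combination
# of expected link lifetime (derived from relative mobility) and local
# node density, both assumed pre-normalised to [0, 1]. Weights are
# arbitrary example values.

def next_hop(neighbors, w_lifetime=0.7, w_density=0.3):
    """neighbors: list of dicts with 'id', 'link_lifetime_s', 'density'.
    Returns the id of the best-scoring neighbor, or None if empty."""
    if not neighbors:
        return None
    def score(n):
        return w_lifetime * n["link_lifetime_s"] + w_density * n["density"]
    return max(neighbors, key=score)["id"]

candidates = [
    {"id": "v12", "link_lifetime_s": 0.9, "density": 0.2},
    {"id": "v07", "link_lifetime_s": 0.4, "density": 0.8},
]
print(next_hop(candidates))  # -> 'v12'
```

Favoring long-lived links addresses the limited-residence-time problem the abstract raises: a relay that will soon leave the network scores low even in a dense neighborhood.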
Publication Date
Tue Nov 01 2016
Journal Name
IOSR Journal of Computer Engineering
Implementation of a New Secure Mechanism for Data Deduplication in Hybrid Cloud

Cloud computing provides a huge amount of storage space, but as the number of users and the size of their data grow, cloud storage environments face serious problems: saving storage space, managing this large volume of data, and preserving the security and privacy of the data. One important method for saving space in cloud storage is data deduplication, a compression technique that keeps only one copy of each piece of data and eliminates the extra copies. To offer security and privacy for sensitive data while still supporting deduplication, this work identifies attacks that exploit hybrid-cloud deduplication, allowing an attacker to gain access to the files of other users based on very small hash signatures of …
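The core deduplication mechanism the abstract describes can be sketched as a store keyed by a content hash, so a second upload of identical data adds nothing. Using a full SHA-256 digest here (rather than the "very small hash signatures" the identified attack exploits) is an illustrative hardening choice, not the paper's exact mechanism.

```python
# Minimal sketch of content-hash deduplication: each unique block is
# stored once, keyed by its SHA-256 digest. The class and method names
# are hypothetical, for illustration only.
import hashlib

class DedupStore:
    def __init__(self):
        self._blocks = {}  # digest -> data

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        self._blocks.setdefault(digest, data)  # keep only one copy
        return digest

    def get(self, digest: str) -> bytes:
        return self._blocks[digest]

store = DedupStore()
d1 = store.put(b"report contents")
d2 = store.put(b"report contents")  # duplicate upload, no new storage
assert d1 == d2 and len(store._blocks) == 1
```

The attack surface mentioned above arises because the digest doubles as proof of possession: anyone who learns a short signature can claim the file, which is why a secure scheme must not let a truncated hash substitute for the data itself.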
Publication Date
Wed Apr 01 2015
Journal Name
Ibn Al-Haitham Journal for Pure and Applied Science
Security For Three-Tiered Web Application

Web application protection lies at two levels: the first is the responsibility of server management, and the second is the responsibility of the site's programmer (the scope of this research). This research proposes developing a secure web application site based on a three-tier architecture (client, server, and database). The security of this system is described as follows: multilevel access by authorization, meaning access to pages is granted according to the authorized level; passwords encrypted using Message Digest Five (MD5) with a salt; Secure Socket Layer (SSL) protocol authentication; and PHP code written according to a set of rules that hide the source code to ensure it cannot be stolen, with verification of input before it is s…
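The salted-MD5 password step described above can be sketched as follows. MD5 is shown only to mirror the paper's scheme; current practice favors a dedicated password hash (bcrypt, scrypt, Argon2). The 16-byte salt size and function names are assumptions for the example.

```python
# Sketch of salted password hashing as the abstract describes (MD5 + salt).
# A random per-user salt defeats precomputed rainbow-table attacks;
# hmac.compare_digest gives a constant-time comparison.
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.md5(salt + password.encode()).hexdigest()
    return salt, digest  # store both alongside the user record

def verify(password, salt, expected_digest):
    return hmac.compare_digest(hash_password(password, salt)[1],
                               expected_digest)

salt, stored = hash_password("s3cret")
assert verify("s3cret", salt, stored)
assert not verify("wrong", salt, stored)
```

Because the salt is random per user, two users with the same password still get different stored digests, which is the main benefit the scheme buys over plain MD5.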
Publication Date
Sun Mar 19 2017
Journal Name
Ibn Al-Haitham Journal for Pure and Applied Sciences
Security For Three-Tiered Web Application

Web application protection lies at two levels: the first is the responsibility of server management, and the second is the responsibility of the site's programmer (the scope of this research). This research proposes developing a secure web application site based on a three-tier architecture (client, server, and database). The security of this system is described as follows: multilevel access by authorization, meaning access to pages is granted according to the authorized level; passwords encrypted using Message Digest Five (MD5) with a salt; Secure Socket Layer (SSL) protocol authentication; and PHP code written according to a set of rules that hide the source code to ensur…
Publication Date
Sat Jun 12 2021
Journal Name
2021 International Conference on Electrical, Communication, and Computer Engineering (ICECCE)
Efficient Private Cloud Resources Platform

Publication Date
Tue Sep 10 2019
Journal Name
Periodicals of Engineering and Natural Sciences (PEN)
A classification model on tumor cancer disease based mutual information and firefly algorithm

Publication Date
Sun Jan 30 2022
Journal Name
Iraqi Journal of Science
Modified Blowfish Algorithm for Image Encryption using Multi Keys based on five Sboxes

In this paper, a new modification is proposed to raise the security level of the Blowfish algorithm by increasing the difficulty of cracking the original message, making it safer against unauthorized attack. Blowfish is a symmetric, variable-length-key, 64-bit block cipher, and here it is applied to grayscale images of different sizes. Instead of using a single key in the cipher operation, the proposed algorithm introduces a second key (KEY2) of one byte, applied in the Feistel function in the first round of both the encryption and decryption processes. In addition, the proposed modified Blowfish algorithm uses five S-boxes instead of four; the additional key (KEY2) is selected randomly from the additional S-box…
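The structural idea above, mixing an extra one-byte key into the round function of the first round only, can be sketched on a toy Feistel cipher. This is not Blowfish: the stand-in round function, key schedule, and the way KEY2 is spread across the word are all assumptions made for illustration.

```python
# Toy 64-bit Feistel sketch of the described modification: a second
# one-byte key (KEY2) is mixed into the round function in round 1 only.
# The round function f is a stand-in, not Blowfish's S-box construction.
MASK32 = 0xFFFFFFFF

def f(x, key2=0):
    # spread the one-byte KEY2 across all four bytes before mixing
    x = (x ^ (key2 * 0x01010101)) & MASK32
    return ((x * 2654435761) ^ (x >> 13)) & MASK32

def feistel_encrypt(block64, round_keys, key2):
    left, right = block64 >> 32, block64 & MASK32
    for i, k in enumerate(round_keys):
        extra = key2 if i == 0 else 0          # KEY2 in first round only
        left, right = right, left ^ f((right ^ k) & MASK32, extra)
    return (left << 32) | right

def feistel_decrypt(block64, round_keys, key2):
    left, right = block64 >> 32, block64 & MASK32
    n = len(round_keys)
    for i, k in enumerate(reversed(round_keys)):
        extra = key2 if i == n - 1 else 0      # undo KEY2 last, in reverse
        left, right = right ^ f((left ^ k) & MASK32, extra), left
    return (left << 32) | right
```

Because each decryption round exactly inverts the corresponding encryption round in reverse order, the ciphertext round-trips back to the plaintext for any KEY2 value, which is the property the modified algorithm must preserve.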
Publication Date
Mon Jan 01 2024
Journal Name
AIP Conference Proceedings
Modeling and analysis of thermal contrast based on LST algorithm for Baghdad city

Publication Date
Mon Dec 01 2014
Journal Name
2014 IEEE Symposium on Differential Evolution (SDE)
Comparative analysis of a modified differential evolution algorithm based on bacterial mutation scheme

A new modified differential evolution algorithm, DE-BEA, is proposed to improve the reliability of the standard DE/current-to-rand/1/bin by implementing a new mutation scheme inspired by the bacterial evolutionary algorithm (BEA). The crossover and selection schemes of the DE method are also modified to fit the new DE-BEA mechanism. The new scheme diversifies the population by applying to every individual a segment-based scheme that generates multiple copies (clones) of each individual one by one and applies the BEA's segment-wise mechanism. These new steps are embedded in the DE/current-to-rand/bin scheme. The performance of the new algorithm has been compared with several DE variants over eighteen benchmark functions, including sever…
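The segment-wise bacterial mutation step described above can be sketched as follows: for each segment in turn, several clones of the current individual are made, only that segment is perturbed in each clone, and the best candidate (including the unmodified original) is kept. The segment layout, Gaussian perturbation, and parameter values are assumptions for illustration, not the paper's exact scheme.

```python
# Illustrative sketch of BEA-style segment-wise mutation on one individual.
# fitness is minimised; the current best is always kept as a candidate,
# so the result is never worse than the input.
import random

def bacterial_mutate(individual, fitness, n_clones=4, n_segments=3,
                     scale=0.1):
    seg_len = max(1, len(individual) // n_segments)
    best = list(individual)
    for s in range(0, len(individual), seg_len):
        candidates = [best]
        for _ in range(n_clones):
            clone = list(best)
            for i in range(s, min(s + seg_len, len(clone))):
                clone[i] += random.gauss(0.0, scale)  # perturb one segment
            candidates.append(clone)
        best = min(candidates, key=fitness)  # keep best clone or original
    return best

random.seed(1)
sphere = lambda x: sum(v * v for v in x)  # simple benchmark function
x = [0.5, -0.3, 0.8, 0.1]
y = bacterial_mutate(x, sphere)
assert sphere(y) <= sphere(x)
```

Running this per individual is what diversifies the population, while the keep-the-best rule makes the step elitist: fitness is monotonically non-increasing per segment pass.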