In the last two decades, networks have changed in step with rapidly changing requirements. Current Data Center Networks host large numbers of machines (tens of thousands) with special bandwidth needs as cloud networking and multimedia content computing grow. Conventional Data Center Networks (DCNs) are strained by the increased number of users and bandwidth requirements, and in turn face many implementation limitations. Current networking devices, with their coupled control and forwarding planes, result in network architectures that are not suitable for dynamic computing and storage needs. Software-Defined Networking (SDN) was introduced to change this notion of traditional networks by decoupling the control and …
With the development of cloud computing in recent years, data center networks have become a major topic in both industrial and academic communities. Nevertheless, traditional methods based on manual configuration and hardware devices are burdensome, expensive, and cannot fully utilize the capabilities of the physical network infrastructure. Thus, Software-Defined Networking (SDN) has been promoted as one of the most encouraging solutions for future Internet performance. SDN is notable for two features: the separation of the control plane from the data plane, and network development through programmable capabilities instead of hardware solutions. This paper introduces an SDN-based optimized Resch…
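The control/data-plane separation described above can be illustrated with a minimal Python sketch: a controller installs match-action rules into a switch's flow table, and the data plane forwards by table lookup. All class, field, and action names below are hypothetical, not taken from the paper.

```python
# Minimal sketch of SDN control/data-plane separation (illustrative only;
# class, field, and action names are hypothetical, not from the paper).

class FlowTable:
    """Data plane: matches packet headers against controller-installed rules."""
    def __init__(self):
        self.rules = []  # list of (match_fields, action), in priority order

    def lookup(self, packet):
        for match, action in self.rules:
            if all(packet.get(k) == v for k, v in match.items()):
                return action
        return "send_to_controller"  # table miss: defer to the control plane

class Controller:
    """Control plane: decides policy and programs the data plane."""
    def install_rule(self, table, match, action):
        table.rules.append((match, action))

table = FlowTable()
ctrl = Controller()
ctrl.install_rule(table, {"dst_ip": "10.0.0.2"}, "forward:port2")

print(table.lookup({"src_ip": "10.0.0.1", "dst_ip": "10.0.0.2"}))  # forward:port2
print(table.lookup({"src_ip": "10.0.0.1", "dst_ip": "10.0.0.9"}))  # send_to_controller
```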
The paper proposes a methodology for predicting packet flow at the data plane in smart SDN based on an intelligent controller built on Spiking Neural Networks (SNN). This methodology is applied to predict the subsequent step of the packet flow, consequently reducing the overcrowding that might otherwise occur. In the proposed model, the centralized controller acts as a reactive controller for managing the cluster-head process in the SDN data layer. The simulation results show the capability of the Spiking Neural Network controller in the SDN control layer to improve Quality of Service (QoS) across the whole network, minimizing the packet loss ratio and increasing the buffer utilization ratio.
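The abstract does not spell out the SNN controller's internals, so the sketch below only illustrates the basic building block of any spiking network, a leaky integrate-and-fire (LIF) neuron, whose spike train could gate a flow-prediction decision; all parameters and the input interpretation are illustrative assumptions, not values from the paper.

```python
# Leaky integrate-and-fire (LIF) neuron: the standard unit of a spiking
# neural network. Parameters are illustrative defaults, not the paper's.
def lif_spikes(input_current, dt=1.0, tau=10.0, v_thresh=1.0, v_reset=0.0):
    v = 0.0
    spikes = []
    for i in input_current:
        # Membrane potential leaks toward rest and integrates the input.
        v += dt * (-v + i) / tau
        if v >= v_thresh:      # threshold crossing emits a spike...
            spikes.append(1)
            v = v_reset        # ...and resets the potential
        else:
            spikes.append(0)
    return spikes

# A sustained high input (e.g., a flow feature such as arrival rate) drives
# the neuron to spike; a low input never crosses threshold.
rates = [0.05] * 20 + [2.0] * 20
print(lif_spikes(rates))
```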
The Tor (The Onion Routing) network was designed to enable users to browse the Internet anonymously. It is known for its anonymity and privacy protection against the many agents who wish to observe users' locations or track their browsing habits. This anonymity stems from the encryption and decryption of Tor traffic: the client's traffic must be encrypted and decrypted before sending and after receiving, which introduces delay and even interruption in the data flow. The exchange of cryptographic keys between network devices plays a pivotal and critical role in facilitating secure communication and ensuring the integrity of cryptographic procedures. This essential process is time-consuming, which causes delay …
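As a rough picture of why Tor traffic incurs this cryptographic overhead, the sketch below layers encryption the way onion routing does, with each relay peeling exactly one layer. Fernet is used here only as a stand-in cipher, and the three-hop circuit is an assumption; Tor's real per-hop cryptography and key exchange differ.

```python
from cryptography.fernet import Fernet

# Onion-style layered encryption (illustrative): the client wraps the payload
# once per relay; each relay removes one layer. Fernet stands in for Tor's
# actual per-hop ciphers, which work differently in practice.
relay_keys = [Fernet.generate_key() for _ in range(3)]  # entry, middle, exit

def onion_wrap(payload: bytes, keys) -> bytes:
    # Encrypt from the exit hop inward, so the entry hop peels first.
    for key in reversed(keys):
        payload = Fernet(key).encrypt(payload)
    return payload

def relay_peel(cell: bytes, key: bytes) -> bytes:
    return Fernet(key).decrypt(cell)

cell = onion_wrap(b"GET / HTTP/1.1", relay_keys)
for key in relay_keys:          # each relay strips one layer in circuit order
    cell = relay_peel(cell, key)
print(cell)  # b'GET / HTTP/1.1'
```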
Due to the huge variety of 5G services, network slicing is a promising mechanism for dividing physical network resources into multiple logical network slices according to the requirements of each user. A highly accurate and fast traffic classification algorithm is required to ensure better Quality of Service (QoS) and effective network slicing. Fine-grained resource allocation can be realized by Software-Defined Networking (SDN) with centralized control of network resources. However, the relevant research activities have concentrated on deep learning systems whose enormous computation and storage demands on the SDN controller limit the speed and accuracy of the traffic classification mechanism. To fill this …
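The paper's lightweight classifier is not reproduced here; the following sketch only shows the general idea of mapping flow features to 5G slice types. The feature names and thresholds are entirely hypothetical.

```python
# Illustrative-only mapping of flow features to 5G slice types (eMBB, URLLC,
# mMTC); the rules and thresholds are hypothetical, not the paper's method.
def classify_slice(flow):
    if flow["latency_budget_ms"] <= 10:
        return "URLLC"   # ultra-reliable low-latency traffic
    if flow["avg_bitrate_mbps"] >= 50:
        return "eMBB"    # high-bandwidth broadband traffic
    return "mMTC"        # sparse machine-type traffic

flows = [
    {"latency_budget_ms": 5,   "avg_bitrate_mbps": 1},    # e.g., industrial control
    {"latency_budget_ms": 100, "avg_bitrate_mbps": 120},  # e.g., 4K video
    {"latency_budget_ms": 500, "avg_bitrate_mbps": 0.1},  # e.g., sensor uplink
]
for f in flows:
    print(classify_slice(f))  # URLLC, eMBB, mMTC
```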
Software-Defined Networking (SDN) with centralized control provides a global view and achieves efficient management of network resources. However, centralized controllers have several limitations related to scalability and performance, especially with the exponential growth of 5G communication. This paper proposes a novel traffic scheduling algorithm to avoid congestion in the control plane. Packet-In messages received from different 5G devices are classified into two classes, critical and non-critical 5G communication, by adopting a Dual-Spike Neural Network (DSNN) classifier and implementing it on a Virtualized Network Function (VNF). Dual spikes identify each class to increase the reliability of the classification …
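A minimal sketch of the scheduling side of this idea follows: once Packet-In messages are labeled critical or non-critical (here by a placeholder flag standing in for the DSNN classifier), a strict-priority queue shields critical 5G traffic from control-plane congestion. The queue discipline and example messages are assumptions, not the paper's algorithm.

```python
from collections import deque

# Strict-priority scheduling of Packet-In messages toward the controller.
# The is_critical flag stands in for the paper's DSNN classifier output.
critical, non_critical = deque(), deque()

def enqueue(msg, is_critical):
    (critical if is_critical else non_critical).append(msg)

def next_for_controller():
    # Drain critical 5G messages first to protect their latency.
    if critical:
        return critical.popleft()
    if non_critical:
        return non_critical.popleft()
    return None

enqueue("PacketIn(emergency-slice)", True)
enqueue("PacketIn(bulk-download)", False)
print(next_for_controller())  # PacketIn(emergency-slice)
print(next_for_controller())  # PacketIn(bulk-download)
```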
Fiber-to-the-Home (FTTH) has long been recognized as a technology that provides future-proof bandwidth [1], but it has generally been too expensive to implement on a wide scale. However, reductions in the cost of electro-optic components and improvements in the handling of optical fiber now make FTTH a cost-effective solution in many situations. The transition to FTTH in the access network also benefits both consumers and service providers because it opens up the near-limitless capacity of the core long-haul network to the local user. In this paper, individual passive optical components, transceivers, and fibers have been put together to form a complete FTTH network, followed by the implementation of the under-construction Baghdad/Al…
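A standard back-of-the-envelope check when assembling such a network is the optical power budget: launch power minus fiber, splitter, connector, and splice losses must stay above the receiver sensitivity. The sketch below uses typical textbook values, not figures from the Baghdad deployment described in the paper.

```python
import math

# Rough optical power budget for a PON-style FTTH link. All numbers are
# typical textbook values (assumptions), not measurements from the paper.
def link_loss_db(length_km, split_ratio, connectors=2, splices=4):
    fiber = 0.35 * length_km                       # ~0.35 dB/km at 1310 nm
    splitter = 10 * math.log10(split_ratio) + 1.5  # ideal split + excess loss
    return fiber + splitter + 0.5 * connectors + 0.1 * splices

tx_power_dbm = 2.0          # typical OLT launch power
rx_sensitivity_dbm = -27.0  # typical ONT receiver sensitivity
loss = link_loss_db(length_km=10, split_ratio=32)
margin = tx_power_dbm - loss - rx_sensitivity_dbm
print(f"loss={loss:.1f} dB, margin={margin:.1f} dB")  # positive margin = viable link
```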
In today's world, most businesses, regardless of size, believe that access to the Internet is imperative if they are going to compete effectively. Yet connecting a private computer (or a network) to the Internet can expose critical or confidential data to malicious attack from anywhere in the world, since unprotected connections to the Internet (or any network topology) leave the user's computer vulnerable to hacker attacks and other Internet threats. Therefore, to provide a high degree of protection to the network and its users, a firewall needs to be used.
A firewall provides a barrier between the user's computer and the Internet (i.e., it prevents unauthorized …
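The core mechanism behind such a barrier is first-match packet filtering with a default-deny policy, sketched below with generic, hypothetical rules rather than any configuration from the paper.

```python
# First-match packet filtering with default deny (illustrative rules only).
RULES = [
    {"proto": "tcp", "dst_port": 80,  "action": "allow"},   # web traffic
    {"proto": "tcp", "dst_port": 443, "action": "allow"},   # TLS traffic
    {"proto": "tcp", "dst_port": 23,  "action": "deny"},    # block telnet
]

def filter_packet(packet):
    for rule in RULES:
        if (packet["proto"] == rule["proto"]
                and packet["dst_port"] == rule["dst_port"]):
            return rule["action"]
    return "deny"  # default deny: anything not explicitly allowed is dropped

print(filter_packet({"proto": "tcp", "dst_port": 443}))  # allow
print(filter_packet({"proto": "udp", "dst_port": 53}))   # deny (no match)
```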
Cloud computing provides a huge amount of space for data storage, but with the increase in the number of users and the size of their data, the cloud storage environment faces serious problems such as saving storage space, managing this large volume of data, and preserving the security and privacy of data. One important method of saving space in cloud storage is data deduplication, a compression technique that allows only one copy of the data to be saved and eliminates the extra copies. To offer security and privacy for sensitive data while supporting deduplication, this work identifies attacks that exploit hybrid cloud deduplication, allowing an attacker to gain access to the files of other users based on very small hash signatures of …
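The deduplication mechanism itself can be sketched in a few lines: store content only when its digest is unseen, and let later uploads reference the stored copy. SHA-256 is a generic choice here; the short signatures the described attack exploits, and the paper's countermeasures, are not reproduced.

```python
import hashlib

# Hash-based deduplication sketch: only the first copy of identical content
# is stored; later uploads are served by reference to the existing digest.
store = {}  # digest -> content

def upload(content: bytes) -> str:
    digest = hashlib.sha256(content).hexdigest()
    if digest not in store:   # first copy is stored...
        store[digest] = content
    return digest             # ...duplicates only reference it

a = upload(b"quarterly-report.pdf bytes")
b = upload(b"quarterly-report.pdf bytes")  # duplicate: no new storage used
print(a == b, len(store))  # True 1
```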
Numerous regions in the city of Baghdad experience congestion and traffic problems. Due to its religious and economic significance, Al-Kadhimiya city (inside the metropolitan area of Baghdad) was chosen as the study area. The data-gathering stage was separated into two branches: a questionnaire method, used to estimate the traffic volumes for the chosen roads, and a field data collection method, which included video recording and manual counting of the volumes entering the selected signalized intersections. The analysis and evaluation of the seventeen urban roads, one highway, and three intersections was performed with the HCS-2000 software. The presented work outlines a system for assessing the level of service …
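Since the evaluation relies on HCS-2000, a small worked example of the underlying HCM 2000 rule may help: a signalized intersection's level of service follows from its average control delay. The thresholds below are the standard HCM 2000 values; the delay inputs are made up, not study data.

```python
# Signalized-intersection level of service from average control delay,
# per the HCM 2000 thresholds (seconds per vehicle) used by HCS-2000.
# Sample delays are illustrative, not data from the Al-Kadhimiya study.
def los_signalized(delay_s):
    for grade, limit in [("A", 10), ("B", 20), ("C", 35), ("D", 55), ("E", 80)]:
        if delay_s <= limit:
            return grade
    return "F"  # delay above 80 s/veh

for delay in (8.0, 42.5, 95.0):
    print(delay, "->", los_signalized(delay))  # A, D, F
```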