With the development of cloud computing in recent years, data center networks have become a major topic in both industry and academia. Nevertheless, traditional methods based on manual configuration and hardware devices are burdensome, expensive, and cannot fully utilize the capabilities of the physical network infrastructure. Thus, Software-Defined Networking (SDN) has been promoted as one of the most promising solutions for future Internet performance. SDN is notable for two features: the separation of the control plane from the data plane, and enabling network development through programmable capabilities instead of hardware solutions. The current paper introduces an SDN-based optimized Resch
Watermarking can be defined as the process of embedding special, wanted, and reversible information in important secure files to protect the ownership or information of the cover file, based on the proposed singular value decomposition (SVD) watermark. The proposed method for digital watermarking has a very large domain for constructing the final number, which protects the watermark from collision. The cover file is the important image that needs to be protected. A hidden watermark is a unique number extracted from the cover file by performing the proposed related and successive operations, starting by dividing the original image into four parts of unequal size. Each of these four parts is treated as a separate matrix and applying SVD
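The block-splitting and per-block SVD step described above can be sketched as follows. This is a minimal illustration only: the split point and the use of each block's largest singular value as the signature are assumptions for demonstration, not the paper's exact construction.

```python
import numpy as np

def blockwise_svd_signature(image, split_row, split_col):
    """Split an image into four unequal blocks at (split_row, split_col)
    and return the largest singular value of each block as a simple
    4-number signature. Illustrative sketch, not the paper's method."""
    blocks = [
        image[:split_row, :split_col],
        image[:split_row, split_col:],
        image[split_row:, :split_col],
        image[split_row:, split_col:],
    ]
    # np.linalg.svd returns singular values in descending order,
    # so index 0 is the largest singular value of each block.
    return [float(np.linalg.svd(b, compute_uv=False)[0]) for b in blocks]

img = np.arange(48.0).reshape(6, 8)        # toy "cover image"
sig = blockwise_svd_signature(img, 2, 3)   # unequal 2x3, 2x5, 4x3, 4x5 blocks
```

Because singular values are stable under small perturbations, a signature built from them changes little under minor image modifications, which is the usual motivation for SVD-based watermarking.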
This work is motivated by the problem of controlling network overhead and reducing the network latency that may cause many unwanted loops resulting from standard routing. It proposes three wireless routing protocols, each building on the advantages of well-known wireless ad-hoc routing protocols such as dynamic source routing (DSR), optimized link state routing (OLSR), destination-sequenced distance vector (DSDV), and the zone routing protocol (ZRP). The first proposed routing protocol is an enhanced destination-sequenced distance vector (E-DSDV) routing protocol, while the second is designed by combining the advantages of DSDV and ZRP, and we named it as
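For context, the loop-avoidance mechanism of standard DSDV (which E-DSDV builds on) is its destination sequence number: a route advertisement replaces the current entry only if it carries a newer sequence number, or the same sequence number with a lower metric. The sketch below shows this textbook update rule, not the proposed E-DSDV extension.

```python
def dsdv_update(table, dest, new_seq, new_metric, next_hop):
    """Standard DSDV route-table update rule: prefer a higher destination
    sequence number; break ties by lower metric. Textbook sketch of plain
    DSDV, not the paper's E-DSDV."""
    cur = table.get(dest)
    if cur is None or new_seq > cur["seq"] or \
       (new_seq == cur["seq"] and new_metric < cur["metric"]):
        table[dest] = {"seq": new_seq, "metric": new_metric, "next_hop": next_hop}
    return table

t = {}
dsdv_update(t, "D", new_seq=10, new_metric=3, next_hop="B")
dsdv_update(t, "D", new_seq=10, new_metric=5, next_hop="C")  # same seq, worse metric: ignored
dsdv_update(t, "D", new_seq=12, new_metric=5, next_hop="C")  # newer seq always wins
```

The sequence-number check is what prevents stale advertisements from reintroducing routing loops, which is the latency/overhead issue this work targets.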
Wireless networking is constantly improving and changing, though the basic principle is the same. Instead of using standard cables to transmit information from one point to another (or more), it uses radio signals. This paper presents a case study considering real-time remote control using wireless UDP/IP-based networks. The aim of this work is to produce a real-time remote control system based upon a simulation model, which can operate via general communication networks and which embodies modern wireless technology. The first part includes a brief
Voice over Internet Protocol (VoIP) is an important technology that is rapidly growing in wireless networks. Quality of Service (QoS) and capacity are two of the most important issues that still need to be researched for wireless VoIP. The main aim of this paper is to analyze the performance of the VoIP application in wireless networks with respect to different transport-layer protocols and audio codecs. Two scenarios were used in the simulation stage. In the first scenario, VoIP with the G.711 codec is transmitted over the User Datagram Protocol (UDP), the Stream Control Transmission Protocol (SCTP), and the Real-Time Transport Protocol (RTP). In the second scenario, VoIP with the G.726 codec is transmitted over the UDP, SCTP, and RTP protocols. Network simulator
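The capacity difference between the two codecs can be made concrete with a back-of-the-envelope bandwidth calculation. The sketch below assumes IPv4/UDP/RTP headers and 20 ms packetization, which are common defaults rather than the paper's simulation parameters; G.711 runs at 64 kbps and G.726 commonly at 32 kbps.

```python
# Per-packet header sizes for VoIP over IPv4/UDP/RTP, in bytes.
IP_HDR, UDP_HDR, RTP_HDR = 20, 8, 12

def voip_bandwidth_kbps(codec_rate_kbps, frame_ms=20):
    """Rough one-way IP-layer bandwidth of a VoIP stream.
    codec_rate_kbps: raw codec bit rate (G.711 = 64, G.726-32 = 32).
    20 ms framing and IPv4/UDP/RTP headers are illustrative assumptions."""
    payload_bytes = codec_rate_kbps * 1000 / 8 * frame_ms / 1000
    packet_bits = (payload_bytes + IP_HDR + UDP_HDR + RTP_HDR) * 8
    packets_per_sec = 1000 / frame_ms
    return packet_bits * packets_per_sec / 1000

g711 = voip_bandwidth_kbps(64)  # 160-byte payload + 40 bytes of headers, 50 pps
g726 = voip_bandwidth_kbps(32)  # 80-byte payload + 40 bytes of headers, 50 pps
```

Note that the fixed 40-byte header overhead weighs proportionally more on the lower-rate codec, which is one reason codec choice and transport protocol interact in capacity studies like this one.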
In this paper, we studied the scheduling of jobs on a single machine. Each of the n jobs is to be processed without interruption and becomes available for processing at time zero. The objective is to find a processing order of the jobs that minimizes the sum of the maximum earliness and the maximum tardiness. Because this problem minimizes both earliness and tardiness values, the model corresponds to the just-in-time production system. Our lower bound depends on decomposing the problem into two subproblems. We present a novel heuristic approach to find a near-optimal solution for the problem; this approach depends on finding efficient solutions for two problems. The first problem is minimizing total completi
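The objective function described above can be evaluated for any candidate sequence as follows. This is only the objective evaluation under the stated assumptions (release times zero, no preemption); it is not the paper's lower bound or heuristic.

```python
def max_earliness_tardiness(proc_times, due_dates, order):
    """Objective E_max + T_max for a job sequence on one machine.
    For each job j in the given order: C_j is its completion time,
    earliness = max(d_j - C_j, 0), tardiness = max(C_j - d_j, 0)."""
    t = 0
    e_max = t_max = 0
    for j in order:
        t += proc_times[j]                      # completion time C_j
        e_max = max(e_max, due_dates[j] - t)    # maximum earliness so far
        t_max = max(t_max, t - due_dates[j])    # maximum tardiness so far
    return e_max + t_max

p = [3, 2, 4]     # processing times
d = [4, 6, 10]    # due dates
val = max_earliness_tardiness(p, d, [0, 1, 2])
```

Such an evaluator is the building block for any heuristic search over sequences: the search proposes orders, and this function scores them.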
In this paper, a Modified Weighted Low Energy Adaptive Clustering Hierarchy (MW-LEACH) protocol is implemented to improve the Quality of Service (QoS) in a Wireless Sensor Network (WSN) with a mobile sink node. The Quality of Service is measured in terms of Throughput Ratio (TR), Packet Loss Ratio (PLR), and Energy Consumption (EC). The protocol is implemented in a Python-based simulation. Simulation results showed that the proposed protocol provides 63% better Quality of Service than the Weighted Low Energy Adaptive Clustering Hierarchy (W-LEACH) protocol.
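The core of any weighted LEACH variant is how cluster heads are scored. The sketch below uses a simple illustrative weight (residual energy divided by distance to the sink); the actual weighting in W-LEACH and MW-LEACH is defined by those protocols and is not reproduced here.

```python
import math
import random

def select_cluster_heads(nodes, sink, k):
    """Pick k cluster heads by weight = residual_energy / distance-to-sink.
    The weight formula is an illustrative assumption, not MW-LEACH's rule:
    nodes with more energy and closer to the (mobile) sink rank higher."""
    def weight(n):
        dist = math.hypot(n["x"] - sink[0], n["y"] - sink[1])
        return n["energy"] / max(dist, 1e-9)   # guard against zero distance
    return sorted(nodes, key=weight, reverse=True)[:k]

rng = random.Random(1)
nodes = [{"id": i,
          "x": rng.uniform(0, 100), "y": rng.uniform(0, 100),
          "energy": rng.uniform(0.5, 2.0)} for i in range(10)]
heads = select_cluster_heads(nodes, sink=(50, 50), k=3)
```

Re-running the selection each round as the sink moves and energies drain is what spreads energy consumption across the network, which drives the EC metric reported above.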
The rapid and enormous growth of the Internet of Things, together with its widespread adoption, has resulted in massive quantities of data that must be processed and sent to the cloud. The delay in processing the data and the time it takes to send it to the cloud led to the emergence of fog computing, a new generation of cloud in which the fog serves as an extension of cloud services at the edge of the network, reducing latency and traffic. Distributing computational resources to minimize makespan and running costs is one of the open challenges in fog computing. This paper provides a new approach for improving the task scheduling problem in a Cloud-Fog environme
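To make the makespan objective concrete, the sketch below shows a standard greedy list-scheduling baseline over heterogeneous nodes (e.g. a slow fog node and a fast cloud node). It is a generic baseline under assumed workloads and node speeds, not the approach proposed in the paper.

```python
def greedy_makespan(tasks, node_speeds):
    """Assign each task to the node that would finish it earliest
    (greedy list scheduling, longest tasks first). tasks are workload
    sizes; node_speeds are processing rates. Returns the makespan."""
    finish = [0.0] * len(node_speeds)
    for w in sorted(tasks, reverse=True):
        # pick the node minimizing its finish time if it takes this task
        i = min(range(len(finish)), key=lambda n: finish[n] + w / node_speeds[n])
        finish[i] += w / node_speeds[i]
    return max(finish)

# Hypothetical example: one fog node (speed 1.0) and one cloud node (speed 2.0)
ms = greedy_makespan(tasks=[4, 3, 2, 2, 1], node_speeds=[1.0, 2.0])
```

Improved schedulers are typically compared against exactly this kind of greedy baseline on both makespan and running cost.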
Scheduling timetables for courses in large university departments is a very hard problem that has often been tackled in previous works, although the results are only partially optimal. This work applies an evolutionary algorithm, using genetic principles, to solve the timetabling problem and obtain a feasible, optimal timetable, with the ability to generate multiple candidate timetables for each stage in the college. The main idea is to generate course timetables automatically while exploring the space of constraints to obtain an optimal and flexible schedule with no redundancy through changes to a viable course timetable. The main contribution of this work is increasing the flexibility of generating opti
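Two ingredients any genetic timetabling approach needs are a fitness function and a mutation operator. The sketch below uses one hard constraint (no two courses in the same room and timeslot) as an illustrative fitness; the paper's actual constraint set is broader and not reproduced here.

```python
import random

def clash_count(timetable):
    """Count hard-constraint violations: two courses assigned the same
    (room, timeslot) pair. timetable maps course -> (room, slot).
    Illustrative GA fitness (lower is better)."""
    seen, clashes = set(), 0
    for assignment in timetable.values():
        if assignment in seen:
            clashes += 1
        seen.add(assignment)
    return clashes

def mutate(timetable, rooms, slots, rng):
    """GA mutation: reassign one random course to a random (room, slot)."""
    child = dict(timetable)
    course = rng.choice(list(child))
    child[course] = (rng.choice(rooms), rng.choice(slots))
    return child

rng = random.Random(0)
tt = {"CS101": ("R1", 1), "CS102": ("R1", 1), "CS103": ("R2", 2)}  # one clash
c = clash_count(tt)
child = mutate(tt, rooms=["R1", "R2"], slots=[1, 2, 3], rng=rng)
```

A GA then evolves a population of such timetables, selecting low-clash individuals and applying mutation and crossover until a clash-free schedule emerges; keeping several zero-clash survivors yields the multiple candidate timetables mentioned above.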
The objective of this work is to combine human biometric characteristics with unique attributes of the computer in order to protect computer networks and resource environments through the development of authentication and authorization techniques. On the human biometric side, the best methods and algorithms were studied, and the conclusion is that the fingerprint is the best, although it has some flaws. The fingerprint algorithm has been improved so that its performance can be adapted to enhance the clarity of the ridge and valley structures of fingerprint images, taking into account the estimation of the orientation and frequency of the nearby ridges. On the side of computer features, a computer and its components, like a human, have uniqu