Software-Defined Networking (SDN) has transformed network management by decoupling the control plane from the data-forwarding plane, yielding unparalleled flexibility and efficiency in network administration. However, the heterogeneity of SDN traffic complicates meeting Quality of Service (QoS) demands and managing network resources efficiently. SDN traffic flows are commonly divided into elephant flows (EFs) and mice flows (MFs). EFs, distinguished by their large packet volumes and long durations, account for a small share of flows yet consume a disproportionate amount of network resources, causing congestion and delays for the smaller MFs. MFs, on the other hand, are short-lived and latency-sensitive, yet account for the vast bulk of traffic in data-center networks. The excessive use of network resources by EFs frequently degrades the performance of MFs. Addressing these issues requires precise classification of network traffic, which in turn enables traffic-aware routing techniques. This paper offers a novel model for classifying SDN traffic into MFs and EFs using a spiking neural network. Once a flow is identified, it is routed according to the classification result: MFs are routed with the Dijkstra algorithm, while EFs use the Widest Dijkstra algorithm. By integrating an advanced classification technique with strategic routing algorithms, the model addresses traffic heterogeneity in SDNs, enabling better resource allocation, reducing congestion, and increasing network performance and dependability. The proposed model outperformed the traditional Software-Defined Network and other algorithms in terms of throughput by 60% and 20%, bandwidth utilization by 5% and 7%, packet loss by 50%, and latency by 60%, respectively.
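The routing split described above can be illustrated with a widest-path variant of Dijkstra, which maximizes the bottleneck bandwidth of the chosen route (suiting bandwidth-hungry elephant flows) instead of minimizing hop cost. The sketch below is an illustrative implementation, not the paper's code; the topology and bandwidth values are invented for the example.

```python
import heapq

def widest_path(graph, src, dst):
    """Widest-path Dijkstra: find the route from src to dst that
    maximizes the minimum (bottleneck) link bandwidth.
    graph: {node: {neighbor: link_bandwidth}}"""
    best = {src: float("inf")}   # best bottleneck found so far per node
    prev = {}
    heap = [(-float("inf"), src)]  # max-heap via negated widths
    while heap:
        neg_w, u = heapq.heappop(heap)
        w = -neg_w
        if u == dst:
            break
        if w < best.get(u, 0):
            continue  # stale heap entry
        for v, bw in graph[u].items():
            cand = min(w, bw)  # bottleneck of the path through u
            if cand > best.get(v, 0):
                best[v] = cand
                prev[v] = u
                heapq.heappush(heap, (-cand, v))
    # reconstruct the path backwards from dst
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], best[dst]

# Hypothetical topology, link capacities in Mbps
graph = {"A": {"B": 10, "C": 100}, "B": {"D": 10}, "C": {"D": 50}, "D": {}}
path, width = widest_path(graph, "A", "D")  # ['A', 'C', 'D'], bottleneck 50
```

For mice flows the standard Dijkstra (minimizing hop count or delay) would be used instead, exactly as the abstract states.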
In light of the globalization that surrounds the business environment and whose impact is reflected on industrial economic units, the whole world has become a single market whose variables affect all units, with each economic unit affected in proportion to its economic contribution. The premise of this research is that Pareto analysis enables industrial economic units to diagnose the risks surrounding them; accordingly, the main objective of the research was to classify risks into internal and external types and identify those that require the most attention.
The research was based on the hypothesis that, when Pareto analysis is used, risks can be identified and addressed before they occur.
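The Pareto (80/20) screening of risks can be sketched in a few lines: rank risks by impact and keep the "vital few" that account for roughly 80% of the total. The risk names and impact scores below are hypothetical, purely to illustrate the mechanism.

```python
def pareto_classify(risks, threshold=0.8):
    """risks: {risk_name: impact_score}. Returns the 'vital few' risks
    whose cumulative impact share first reaches the threshold (80/20 rule)."""
    total = sum(risks.values())
    vital, cum = [], 0.0
    # walk risks from largest to smallest impact
    for name, impact in sorted(risks.items(), key=lambda kv: -kv[1]):
        if cum / total >= threshold:
            break  # remaining risks are the 'trivial many'
        vital.append(name)
        cum += impact
    return vital

# Hypothetical risk register with impact scores
risks = {"supply": 50, "fx": 25, "fire": 10, "theft": 8, "it": 7}
print(pareto_classify(risks))  # the few risks driving ~80% of total impact
```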
This paper aims to study the chemical degradation of Brilliant Green (B.G) in water via the photo-Fenton (H2O2/Fe2+/UV) and Fenton (H2O2/Fe2+) reactions. Fe-B nanoparticles are applied as a coating on the inner wall surface of the reactor. X-ray diffraction (XRD) analysis shows that the Fe-B nanocomposite catalyst consists mainly of SiO2 (quartz) and Fe2O3 (hematite) crystallites. B.G dye degradation is measured to assess the catalytic action of the synthesized Fe-B surface in the presence of UVC light and hydrogen peroxide. A B.G dye solution with an initial concentration of 10 ppm is reduced by 99.9% under the following conditions: 2 ml H2O2, pH 7, and 25 °C, within 10 min. The solution pH clearly affects the photocatalytic degradation.
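Dye degradation by Fenton-type reactions is commonly modeled with pseudo-first-order kinetics; the abstract does not state which model it uses, so the calculation below is only an assumption-labeled illustration of what its reported figures (10 ppm reduced by 99.9% in 10 min) would imply under that model.

```python
import math

def first_order_k(c0, c, t):
    """Pseudo-first-order rate constant k = ln(C0/C) / t.
    A common kinetic model for photo-Fenton dye degradation
    (assumed here; not stated in the abstract)."""
    return math.log(c0 / c) / t

# 10 ppm reduced by 99.9% (to 0.01 ppm) within 10 min
k = first_order_k(10.0, 0.01, 10.0)  # about 0.69 min^-1
```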
Objective(s): To assess mothers' knowledge about their children with sickle cell anemia and non-pharmacological approaches to pain management, and to examine the relationship between mothers' knowledge and their demographic data (age, level of education, and occupation).
Methodology: A descriptive design was used in the present study for the period from September 19th, 2020 to March 30th, 2021. The study was conducted on a non-probability (purposive) sample of 30 mothers of children with sickle cell anemia. The data were analyzed through descriptive and inferential statistical approaches using SPSS version 22.0.
Results: The findings of the study indicated that moderate …
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widespread and easy to use. Its use has led to an unprecedented increase in the amount of data that business organizations deal with. The sheer amount of data available through the internet, and the randomness with which it accumulates, is a problem that many parties seek to solve. Forecasts indicated that by 2017 the number of devices connected to the internet would be an estimated three times the population of the Earth, and that in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, so-called data mining emerged as a …
Permeability data are of major importance in all reservoir simulation studies. Their importance increases in mature oil and gas fields owing to their sensitivity to the requirements of some specific improved-recovery operations. However, the industry holds a huge stock of air permeability measurements against a small number of liquid permeability values, due to the relatively high cost of special core analysis.
The current study suggests a correlation to convert the air permeability data conventionally measured during laboratory core analysis into liquid permeability. This correlation provides a feasible estimate in cases of data loss and poorly consolidated formations, or in cas…
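Air-to-liquid permeability correlations are typically power laws fitted in log-log space. The abstract does not give the correlation's form or coefficients, so the sketch below only shows the generic fitting procedure on invented data; the function name and sample values are assumptions.

```python
import math

def fit_power_law(k_air, k_liq):
    """Fit k_liquid = a * k_air**b by least squares in log-log space,
    the usual form of air-to-liquid permeability correlations.
    Returns the coefficients (a, b)."""
    xs = [math.log(k) for k in k_air]
    ys = [math.log(k) for k in k_liq]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # slope of the log-log regression line is the exponent b
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)  # intercept back-transformed to a
    return a, b

# Invented core-plug data (mD) that happens to follow k_liq = 0.5 * k_air**0.9
k_air = [1.0, 10.0, 100.0]
k_liq = [0.5 * k ** 0.9 for k in k_air]
a, b = fit_power_law(k_air, k_liq)
```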
This paper proposes the Teaching-Learning-Based Optimization (TLBO) algorithm to solve the 3-D container packing problem. The objective, expressed in a mathematical model, is to optimize the space usage in a container. Besides the interaction between students and teacher, the algorithm also models the learning process among students in the classroom, and it needs no algorithm-specific control parameters. Thus, TLBO uses a teacher phase and a student phase as its main updating processes to find the best solution. To validate the algorithm's effectiveness, it was implemented on three sample cases: small data with 5 size-types of items totaling 12 units, medium data with 10 size-types of items w…
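The teacher and student phases mentioned above can be sketched on a simple continuous test function. This is a minimal generic TLBO, not the paper's 3-D packing formulation; the sphere objective and all parameter values below are illustrative assumptions.

```python
import random

def tlbo_minimize(f, dim, bounds, pop=20, iters=100, seed=1):
    """Minimal TLBO sketch. Teacher phase pulls learners toward the
    best solution; learner phase lets random pairs of learners teach
    each other. No algorithm-specific control parameters are needed."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    F = [f(x) for x in X]
    clamp = lambda v: min(max(v, lo), hi)
    for _ in range(iters):
        # --- teacher phase ---
        teacher = X[min(range(pop), key=F.__getitem__)]
        mean = [sum(x[d] for x in X) / pop for d in range(dim)]
        for i in range(pop):
            Tf = rng.choice((1, 2))  # teaching factor
            new = [clamp(X[i][d] + rng.random() * (teacher[d] - Tf * mean[d]))
                   for d in range(dim)]
            fn = f(new)
            if fn < F[i]:           # greedy acceptance
                X[i], F[i] = new, fn
        # --- learner (student) phase ---
        for i in range(pop):
            j = rng.randrange(pop)
            if j == i:
                continue
            sign = 1 if F[i] < F[j] else -1  # move toward the better learner
            new = [clamp(X[i][d] + sign * rng.random() * (X[i][d] - X[j][d]))
                   for d in range(dim)]
            fn = f(new)
            if fn < F[i]:
                X[i], F[i] = new, fn
    best = min(range(pop), key=F.__getitem__)
    return X[best], F[best]

# Illustrative run: minimize the sphere function in 2-D
x_best, f_best = tlbo_minimize(lambda x: sum(v * v for v in x), 2, (-5.0, 5.0))
```

For the packing problem in the paper, the continuous learner positions would be decoded into item placements via the mathematical model the abstract mentions.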
At present, smooth movement on the roads is a need for every user. Many roads, especially in urban areas, are geometrically improved because the number of vehicles increases over time.
In this research, Highway Capacity Software (HCS) 2000 is adopted to determine the effectiveness of a roundabout in terms of its capacity, delay, and level of service.
The results of the analysis indicated that the Ahmed Urabi roundabout operates at level of service F, with an average control delay of 300 seconds per vehicle during peak hours.
The through movements of the Alkarrada-Aljadiriya direction (the major direction) represent the heaviest traffic …
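The level-of-service grade reported above follows directly from the control delay. Assuming the HCM delay thresholds for unsignalized intersections and roundabouts (A ≤ 10 s/veh up through E ≤ 50 s/veh, F beyond), the mapping can be sketched as:

```python
def roundabout_los(delay_s):
    """Map average control delay (s/veh) to level of service, using
    the HCM thresholds assumed for roundabouts: A<=10, B<=15, C<=25,
    D<=35, E<=50, F otherwise."""
    for los, limit in (("A", 10), ("B", 15), ("C", 25), ("D", 35), ("E", 50)):
        if delay_s <= limit:
            return los
    return "F"

# 300 s/veh of control delay, as reported for the Ahmed Urabi roundabout
grade = roundabout_los(300)  # "F"
```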
The aim of this research is to explore the temporal and spatial distribution of traffic volume demand and to investigate its vehicle composition. The four selected links represent the activity of transportation facilities and different congestion points by direction. The study area belongs to the Al-Rusafa sector of Baghdad city, which exhibits a higher rate of traffic congestion on working days during the morning and evening peak periods due to its mixed land uses. The results showed that Link (1), from the Medical City intersection to the Sarafiya intersection, carried the highest traffic volume in both the morning (AM) and afternoon (PM) peak periods, where demand exceeds capacity along the link corridor. Also, higher values f…