To ensure fault tolerance and distributed management, distributed protocols are one of the major architectural concepts underlying the Internet. However, their inefficiency, instability, and fragility could potentially be overcome with the help of a novel networking architecture called software-defined networking (SDN), whose main property is the separation of the control and data planes. To reduce congestion and thus improve latency and throughput, the traffic load must be distributed homogeneously over the different network paths. This paper presents a smart flow steering agent (SFSA) for routing data flows based on current network conditions. To enhance throughput and minimize latency, the SFSA distributes network traffic to suitable paths while supervising link and path loads. A scenario with a minimum spanning tree (MST) routing algorithm and another with the open shortest path first (OSPF) routing algorithm were employed to assess the SFSA. Compared to these two routing algorithms, the suggested SFSA strategy achieved a 2% reduction in packet drop ratio (PDR), a 15-45% reduction in end-to-end delay depending on the traffic produced, and a 23% reduction in round trip time (RTT). The Mininet emulator and POX controller were employed to conduct the simulation. Another advantage of the SFSA over MST and OSPF is that its implementation and recovery times do not exhibit fluctuations. The smart flow steering agent opens a new horizon for deploying new smart agents in SDN that enhance network programmability and management.
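The core idea of load-aware flow steering can be illustrated with a minimal sketch: run a shortest-path search where edge weights are current link loads instead of hop counts, so congested links are avoided. The topology, load values, and function name below are illustrative assumptions for exposition, not the paper's implementation.

```python
import heapq

def least_loaded_path(graph, loads, src, dst):
    """Dijkstra over link loads: returns the path whose total load is minimal.
    graph: {node: [neighbors]}, loads: {(u, v): current link load}."""
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v in graph[u]:
            nd = d + loads[(u, v)]
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    # Walk the predecessor map back from dst to reconstruct the path.
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Illustrative topology: two paths from h1 to h2; links via s2 are congested.
graph = {"h1": ["s1"], "s1": ["s2", "s3"], "s2": ["h2"], "s3": ["h2"], "h2": []}
loads = {("h1", "s1"): 0.1, ("s1", "s2"): 0.9, ("s1", "s3"): 0.2,
         ("s2", "h2"): 0.9, ("s3", "h2"): 0.2}
print(least_loaded_path(graph, loads, "h1", "h2"))  # steers around s2
```

In an SDN deployment this logic would live in the controller (here, POX), which already has a global view of link statistics and installs the chosen path as flow rules on the switches.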
There is an implicit but fundamental assumption behind the theory underlying estimation with time series, namely that the series is stationary, or in the language of Engle and Granger, integrated of order zero, denoted I(0). It is well known, for example, that the tables of the t-statistic are designed primarily to deal with the results of regressions that use stationary series. This assumption was treated as an axiom until the mid-seventies, when researchers conducted applied studies without taking into account the properties of the time series used prior to estimation, accepting the results of these tests and building on them on the basis of the assumed applicability of the theory
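The I(0)/I(1) distinction can be made concrete with a small sketch (the sample values are assumptions for exposition): an I(1) level series is the running sum of stationary increments, and first-differencing it recovers those I(0) increments.

```python
from itertools import accumulate

# Stationary increments: an I(0) sequence.
increments = [0.5, -1.2, 0.3, 0.8, -0.4]

# The random-walk-style level series built from them is I(1).
levels = list(accumulate(increments))

# First-differencing the I(1) levels recovers the I(0) increments
# (up to floating-point rounding).
diffs = [levels[0]] + [b - a for a, b in zip(levels, levels[1:])]
print(diffs)
```

This is why unit-root pretesting matters: regressing one undifferenced I(1) series on another invites spurious results, while the differenced series satisfies the stationarity assumption behind the standard t-tables.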
The financial markets are one of the sectors whose data is characterized by continuous movement most of the time; it changes constantly, so it is difficult to predict its trends. This creates a need for methods, means, and techniques for making decisions, and it pushes investors and analysts in the financial markets to use various methods to predict the direction of market movement. To reach the goal of making decisions across different investments, the support vector machine algorithm and the CART regression tree algorithm are used to classify the stock data in order to determine
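The CART splitting criterion mentioned above can be sketched in a few lines: for a regression tree, each candidate threshold splits the data in two, and the tree greedily picks the threshold minimizing the total squared error of the two halves. The toy feature and return values below are illustrative assumptions, not the paper's dataset.

```python
def best_split(xs, ys):
    """CART-style regression split on a single feature: choose the
    threshold minimizing the summed squared error of the two halves."""
    def sse(vals):
        if not vals:
            return 0.0
        mean = sum(vals) / len(vals)
        return sum((v - mean) ** 2 for v in vals)

    best_cost, best_t = float("inf"), None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        cost = sse(left) + sse(right)
        if cost < best_cost:
            best_cost, best_t = cost, t
    return best_t

# Toy series: the target jumps once the feature crosses 3.
xs = [1, 2, 3, 4, 5, 6]
ys = [0.1, 0.2, 0.1, 1.0, 1.1, 0.9]
print(best_split(xs, ys))  # 3
```

A full CART builds a tree by applying this split recursively to each half; an SVM instead separates the classes with a maximum-margin hyperplane, which is why the two are often compared on the same classification task.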
Long memory analysis is one of the most active areas in econometrics and time series, where various methods have been introduced to identify and estimate the long memory parameter in partially integrated time series. One of the most common models used to represent time series that have long memory is the ARFIMA (Autoregressive Fractionally Integrated Moving Average) model, in which the differencing order is a fractional number called the fractional parameter. To analyze and fit the ARFIMA model, the fractional parameter must be estimated, and there are many methods for its estimation. In this research, the estimation methods were divided into indirect methods, where the Hurst parameter is estimated first
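The fractional differencing operator (1 - B)^d at the heart of ARFIMA expands, via the binomial series, into weights computable with a simple recursion: w_0 = 1 and w_k = w_{k-1} (k - 1 - d) / k. The following sketch illustrates the standard formula; the function names and series are illustrative, not from the paper.

```python
def frac_diff_weights(d, n):
    """First n weights of the fractional difference operator (1 - B)^d:
    w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k."""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - d) / k)
    return w

def frac_diff(series, d):
    """Apply the truncated fractional difference to a series."""
    w = frac_diff_weights(d, len(series))
    return [sum(w[k] * series[t - k] for k in range(t + 1))
            for t in range(len(series))]

print(frac_diff_weights(0.4, 4))  # weights decay slowly for 0 < d < 0.5
```

Note that d = 1 recovers the ordinary first difference (weights 1, -1, 0, ...), while a fractional d between 0 and 0.5 gives slowly decaying weights, which is exactly the long-memory behavior the fractional parameter captures.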
Among metaheuristic algorithms, population-based algorithms are explorative search algorithms superior to local search algorithms in terms of exploring the search space to find globally optimal solutions. However, the primary downside of such algorithms is their low exploitative capability, which prevents them from refining the search-space neighborhood toward more optimal solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA is limited by premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region and to explore the global regions of the search space. On the
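The standard firefly update that underlies FA can be sketched briefly: each firefly moves toward every brighter one, with attractiveness decaying as exp(-gamma * r^2), plus an alpha-scaled random walk. The 1-D objective and parameter values below are illustrative assumptions (alpha is set to 0 here so the run is deterministic; a real FA run keeps alpha > 0).

```python
import math
import random

def firefly_step(pop, objective, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
    """One iteration of the standard firefly update on a 1-D problem:
    each firefly moves toward every brighter (lower-objective) firefly."""
    rng = rng or random.Random(0)
    new_pop = list(pop)
    for i in range(len(pop)):
        for xj in pop:
            if objective(xj) < objective(new_pop[i]):
                r2 = (new_pop[i] - xj) ** 2
                beta = beta0 * math.exp(-gamma * r2)  # attractiveness decays with distance
                new_pop[i] += beta * (xj - new_pop[i]) + alpha * (rng.random() - 0.5)
    return new_pop

sphere = lambda x: x * x  # toy objective, minimum at 0
pop = [4.0, -3.0, 1.0, 2.5]
for _ in range(20):
    pop = firefly_step(pop, sphere, alpha=0.0)  # alpha=0: deterministic demo
print(min(sphere(x) for x in pop))  # 1.0 (the brightest firefly stays put)
```

The demo also shows the premature-convergence weakness the abstract describes: with no randomization or neighborhood strategy, the brightest firefly never moves, so the swarm can stall at whatever its best member happens to be.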
The changes that happen in training, especially on the skill side, demand tests; the researchers found a lack of standardized patterns for specialized tests of defensive skills against shooting, as existing tests had been applied to other samples without considering age. The research problem was therefore to standardize tests of defense against shooting that serve junior players and match the original population, in service of the game of basketball. The research sample was chosen from junior players of Baghdad clubs in the (2011-2012) season; it numbered (82) players representing (7) clubs, a proportion of (82%). After the data were obtained and processed statistically, we reached the conclusion that tests of defense against shooting were applied for the first time in an Iraqi environment
Global Navigation Satellite Systems (GNSS) have become an integral part of a wide range of applications. One of these applications is the use of cellular phones to locate the position of users, a technology that has been employed in social media applications. Moreover, GNSS has been effectively employed in transportation, GIS, mobile satellite communications, etc. On the other hand, the geomatics sciences use GNSS for many practical and scientific applications such as surveying, mapping, and monitoring.
In this study, the GNSS raw data of the ISER CORS station, which is located in the north of Iraq, are processed and analyzed to build up coordinate time series for the purpose of detecting the
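Detecting station movement from a CORS coordinate time series is commonly done by fitting a linear trend with least squares, interpreting the slope as the station velocity. The following minimal sketch illustrates that step; the epoch and offset values are illustrative assumptions, not the ISER data.

```python
def linear_trend(times, values):
    """Ordinary least-squares fit value ≈ intercept + slope * time,
    returning (intercept, slope)."""
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    slope = (sum((t - mt) * (v - mv) for t, v in zip(times, values))
             / sum((t - mt) ** 2 for t in times))
    return mv - slope * mt, slope

# Illustrative northing offsets (mm) at 5 yearly epochs:
epochs = [0.0, 1.0, 2.0, 3.0, 4.0]
north_mm = [0.0, 2.1, 3.9, 6.0, 8.0]
intercept, slope = linear_trend(epochs, north_mm)
print(round(slope, 2))  # 1.99, i.e. roughly a 2 mm/yr velocity
```

In practice each coordinate component (north, east, up) is fitted separately, and residuals about the trend reveal seasonal signals or discontinuities in the series.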
The steel jetty selected for strengthening is in Baghdad city, over the Tigris River; it consists of 55 short spans of approximately 4 meters each and one navigational opening of 12 m. The bridge is 224 meters long and 8 meters wide. The strengthening system was designed to remove the overstresses that occurred when the bridge was subjected to abnormal loads of 380 tons. The strengthening system, installed in spring 2008, relies on added side supporting elements that impose reversal forces on the bridge to counteract most of the loads expected from abnormal heavy loads. The bridge was load tested before and after the strengthening system was activated. The load test results indicate that the strengthening
In this study, different methods were used for estimating the location and scale parameters of the extreme value distribution, which is one of the exponential family of distributions: maximum likelihood estimation (MLE), the method of moments (ME), and approximation estimators based on percentiles, known as the White method. Ordinary least squares estimation (OLS), weighted least squares estimation (WLS), ridge regression estimation (Rig), and adjusted ridge regression estimation (ARig) were also used. Two parameters for expected value to the percentile as estimation for distribution f
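The method-of-moments estimators for the (Gumbel-type) extreme value distribution follow directly from its known mean and variance: scale = s * sqrt(6) / pi and location = mean - gamma * scale, where gamma is the Euler-Mascheroni constant. A minimal sketch under those standard formulas (the sample values are illustrative assumptions):

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_moments(sample):
    """Method-of-moments estimators for the Gumbel (extreme value)
    distribution: scale = s * sqrt(6) / pi, loc = mean - gamma * scale."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)  # sample variance
    scale = math.sqrt(6.0 * var) / math.pi
    loc = mean - EULER_GAMMA * scale
    return loc, scale

loc, scale = gumbel_moments([0.0, 2.0, 4.0, 6.0, 8.0])
print(loc, scale)
```

MLE, by contrast, has no closed form for this distribution and requires solving the likelihood equations numerically, which is one reason moment- and percentile-based approximations such as the White method are compared against it.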
Objectives: The study aims to evaluate the effectiveness of an educational program on nurses' knowledge of nursing management for patients undergoing percutaneous coronary intervention (PCI), and to find out the relationship between nurses' knowledge and some of their demographic characteristics (age, gender, level of education, and years of experience in cardiac units).
Methodology: A quasi-experimental one-group (pre- and post-test) study was conducted at the Heart Center in Al-Diwaniyah city for the period from December 7, 2019 to February 23, 2020. A sample of (40) nurses working in the heart center was chosen from different nursing titles. The sample covered one group