In recent years, data centre (DC) networks have had to support ever faster data exchange. Software-defined networking (SDN) changes the design of conventional networks by separating the control plane from the data plane, and it overcomes the limitations of traditional DC networks caused by the rapidly growing number of applications, websites, and data storage demands. Software-defined networking data centres (SDN-DC) based on the OpenFlow (OF) protocol are used to achieve better performance when executing traffic load-balancing (LB) tasks; the LB function divides the traffic-flow demands among the end devices to avoid link congestion. In short, SDN offers more flexible configuration, easier enhancement, and greater elasticity for managing large network deployments. In this paper, the OpenDaylight controller (ODL-CO) with the OpenFlow 1.4 protocol and an ant colony optimization (ACO) algorithm is used to evaluate the LB function over IPv6 in an SDN-DC network, by studying throughput, data transfer, bandwidth, and average delay before and after applying the LB algorithm. After applying LB, throughput, data transfer, and bandwidth increased, while the average delay decreased.
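The abstract does not give the algorithm's details; the following is a minimal sketch of how ant colony optimization can pick the least-loaded path between two hosts, assuming a simple link-utilization model. The topology, `LINK_LOAD` values, and pheromone parameters are illustrative, not taken from the paper.

```python
import random
from collections import defaultdict

# Illustrative topology: adjacency list plus current link utilization (0..1).
# These values are hypothetical; the paper's SDN-DC topology is not given here.
GRAPH = {
    "h1": ["s1"], "s1": ["h1", "s2", "s3"],
    "s2": ["s1", "s4"], "s3": ["s1", "s4"],
    "s4": ["s2", "s3", "h2"], "h2": ["s4"],
}
LINK_LOAD = {("s1", "s2"): 0.8, ("s2", "s4"): 0.7,
             ("s1", "s3"): 0.2, ("s3", "s4"): 0.3}

def load(u, v):
    return LINK_LOAD.get((u, v), LINK_LOAD.get((v, u), 0.0))

def aco_path(src, dst, ants=20, iters=30, alpha=1.0, beta=2.0, rho=0.1):
    """Return a path whose links are least utilized, found by ACO."""
    pheromone = defaultdict(lambda: 1.0)
    best_path, best_cost = None, float("inf")
    for _ in range(iters):
        for _ in range(ants):
            path, visited = [src], {src}
            while path[-1] != dst:
                u = path[-1]
                choices = [v for v in GRAPH[u] if v not in visited]
                if not choices:
                    break  # dead end; abandon this ant
                weights = [(pheromone[(u, v)] ** alpha) *
                           ((1.0 / (load(u, v) + 0.01)) ** beta) for v in choices]
                v = random.choices(choices, weights=weights)[0]
                path.append(v)
                visited.add(v)
            if path[-1] != dst:
                continue
            cost = sum(load(a, b) for a, b in zip(path, path[1:]))
            if cost < best_cost:
                best_path, best_cost = path, cost
            # Deposit pheromone inversely proportional to the path's total load.
            for a, b in zip(path, path[1:]):
                pheromone[(a, b)] += 1.0 / (cost + 0.01)
        # Evaporate pheromone so congested paths are gradually forgotten.
        for k in pheromone:
            pheromone[k] *= (1.0 - rho)
    return best_path

print(aco_path("h1", "h2"))  # expected to prefer the lightly loaded s1-s3-s4 branch
```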
The need for an optimal water quality management process has motivated researchers to develop prediction models. One widely used forecasting model is the seasonal autoregressive integrated moving average (SARIMA) model. In the present study, a SARIMA model was developed in R to fit time series data of monthly fluoride content collected from six stations on the Tigris River for the period from 2004 to 2014. The adequate SARIMA model, having the lowest Akaike information criterion (AIC) and mean squared error (MSE), was found to be SARIMA(2,0,0)(0,1,1). The model parameters were identified and diagnosed to derive the forecasting equations at each selected location. The correlat
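The study fits its model in R; purely as an illustration, the sketch below fits the same SARIMA(2,0,0)(0,1,1) specification in Python with statsmodels on a synthetic monthly series. The synthetic data and the 12-month seasonal period are assumptions, not the paper's observations.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic monthly series standing in for the fluoride observations (2004-2014).
rng = np.random.default_rng(0)
idx = pd.date_range("2004-01", "2014-12", freq="MS")
seasonal = 0.2 * np.sin(2 * np.pi * np.arange(len(idx)) / 12)
series = pd.Series(0.5 + seasonal + rng.normal(0, 0.05, len(idx)), index=idx)

# SARIMA(2,0,0)(0,1,1) with a 12-month season, as selected in the study by AIC/MSE.
model = SARIMAX(series, order=(2, 0, 0), seasonal_order=(0, 1, 1, 12))
fit = model.fit(disp=False)

print("AIC:", fit.aic)            # criterion used for model selection
print(fit.forecast(steps=12))     # 12-month-ahead forecast from the fitted model
```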
We propose a novel strategy to optimize the test suite required for testing both hardware and software in a production line. The strategy is based on two processes: a Quality Signing Process and a Quality Verification Process. Unlike earlier work, the proposed strategy integrates black-box and white-box techniques to derive an optimum test suite during the Quality Signing Process; the generated optimal test suite then significantly improves the Quality Verification Process. Considering both processes, the novelty of the proposed strategy is that the optimization and reduction of the test suite is performed by selecting only mutant-killing test cases from cumulating t-way test ca
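As a rough illustration of the selection step described above, the sketch below greedily keeps only test cases that kill at least one not-yet-killed mutant. The kill matrix and test names are hypothetical, and the paper's t-way generation and signing steps are not reproduced.

```python
# Hypothetical kill matrix: test case -> set of mutants it kills.
KILLS = {
    "t1": {"m1", "m2"},
    "t2": {"m2"},          # redundant: everything it kills, t1 already kills
    "t3": {"m3"},
    "t4": {"m1", "m4"},
}

def reduce_suite(kills):
    """Greedily select test cases that each kill at least one new mutant."""
    selected, killed = [], set()
    # Consider the strongest tests first so redundant ones are dropped.
    for test, mutants in sorted(kills.items(), key=lambda kv: -len(kv[1])):
        new = mutants - killed
        if new:
            selected.append(test)
            killed |= new
    return selected, killed

suite, covered = reduce_suite(KILLS)
print(suite)    # ['t1', 't4', 't3'] -- t2 is dropped as redundant
print(covered)  # {'m1', 'm2', 'm3', 'm4'}
```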
This work is devoted to modeling streamer discharge propagation in a liquid dielectric (water) gap using the bubble theory. The electrical discharge (streamer) propagating within a dielectric liquid subjected to a divergent electric field is modeled with the finite element method in two dimensions. The solution of Laplace's equation gives the voltage and electric field distributions within a point (pin)-plane electrode configuration, and the plasma channels were followed step by step. The results show that the breakdown voltage required for a 3 mm dielectric liquid gap at atmospheric pressure is 13 kV. Also, the electric potential and field distributions sho
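As a minimal illustration of the governing equation only, the sketch below solves the 2-D Laplace equation for the potential by simple finite differences (not the paper's finite element formulation) on a toy pin-plane grid. The grid size and the 13 kV pin potential are rough assumptions for demonstration.

```python
import numpy as np

# Toy 2-D grid representing a pin-plane gap; geometry and resolution are illustrative.
NX, NY = 60, 60
V = np.zeros((NY, NX))
V[0, NX // 2 - 1:NX // 2 + 2] = 13e3    # "pin" tip held at 13 kV (top centre)
V[-1, :] = 0.0                          # grounded plane electrode (bottom row)

# Jacobi iteration for Laplace's equation: each interior point becomes the
# average of its four neighbours, repeated until the potential settles.
for _ in range(5000):
    V_new = V.copy()
    V_new[1:-1, 1:-1] = 0.25 * (V[:-2, 1:-1] + V[2:, 1:-1] +
                                V[1:-1, :-2] + V[1:-1, 2:])
    # Re-impose the electrode boundary conditions after each sweep.
    V_new[0, NX // 2 - 1:NX // 2 + 2] = 13e3
    V_new[-1, :] = 0.0
    V = V_new

# Electric field is the negative gradient of the potential: E = -grad(V).
Ey, Ex = np.gradient(-V)
print("max |E| on grid:", np.hypot(Ex, Ey).max())
```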
Social networking sites are among the modern communication technologies that have contributed to expressing public opinion trends towards various events and crises, of which security crises are the most important because of their ability to influence the community life of the public. This study aims to recognize their role in shaping the opinions of the educated class of the public, a class characterized by a high level of knowledge and culture and by experience in dealing with the media. Their advantage is an active audience that expresses its views on the situations, events, and news published on them, as well as its attitudes towards and sympathy with those events. So a number of questions are included in the ques
Due to the huge variety of 5G services, network slicing is a promising mechanism for dividing the physical network resources into multiple logical network slices according to the requirements of each user. A highly accurate and fast traffic classification algorithm is required to ensure better Quality of Service (QoS) and effective network slicing. Fine-grained resource allocation can be realized by Software Defined Networking (SDN) with centralized control of network resources. However, the relevant research activities have concentrated on deep learning systems, whose enormous computation and storage requirements burden the SDN controller and limit the speed and accuracy of the traffic classification mechanism. To fill thi
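The abstract cuts off before describing the proposed classifier; purely as an illustration of flow-feature traffic classification for slice assignment, the sketch below trains a lightweight scikit-learn model on synthetic flow statistics. The feature names, slice labels, and data are assumptions, not the paper's method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 3000

# Synthetic per-flow features: [mean packet size (B), mean inter-arrival (ms), duration (s)].
X = np.column_stack([
    rng.normal(800, 200, n),   # packet size
    rng.exponential(20, n),    # inter-arrival time
    rng.exponential(30, n),    # flow duration
])
# Hypothetical slice labels: 0 = eMBB (video), 1 = URLLC (low latency), 2 = mMTC (IoT).
y = np.where(X[:, 0] > 900, 0,          # big packets -> eMBB
             np.where(X[:, 1] < 10, 1,  # tight inter-arrival -> URLLC
                      2))               # otherwise -> mMTC

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# A small forest keeps inference cheap enough to run alongside an SDN controller.
clf = RandomForestClassifier(n_estimators=50, max_depth=8, random_state=0)
clf.fit(X_train, y_train)
print("accuracy on held-out flows:", clf.score(X_test, y_test))
```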
The demand for internet applications has increased rapidly, and providing quality of service (QoS) for varied internet applications is a challenging task. One factor that significantly affects QoS is the transport layer, which provides end-to-end data transmission across a network. Currently, the most common transport protocols used by internet applications are TCP (Transmission Control Protocol) and UDP (User Datagram Protocol). There are also more recent transport protocols such as DCCP (Datagram Congestion Control Protocol), SCTP (Stream Control Transmission Protocol), and TFRC (TCP-Friendly Rate Control), which are in the standardization process of the Internet Engineering Task
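As a minimal illustration of the two common transport protocols named above, the sketch below echoes one message over TCP and one over UDP using Python's standard socket module on the loopback interface. The port numbers and endpoints are illustrative only.

```python
import socket
import threading
import time

HOST, TCP_PORT, UDP_PORT = "127.0.0.1", 50007, 50008  # illustrative local endpoints

def tcp_echo_server():
    # TCP: connection-oriented, reliable, byte-stream delivery.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, TCP_PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            conn.sendall(conn.recv(1024))

def udp_echo_server():
    # UDP: connectionless, datagram-based, no delivery or ordering guarantees.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as srv:
        srv.bind((HOST, UDP_PORT))
        data, addr = srv.recvfrom(1024)
        srv.sendto(data, addr)

threading.Thread(target=tcp_echo_server, daemon=True).start()
threading.Thread(target=udp_echo_server, daemon=True).start()
time.sleep(0.2)  # give the echo servers a moment to bind

# TCP client: must establish a connection before exchanging data.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as c:
    c.connect((HOST, TCP_PORT))
    c.sendall(b"hello over tcp")
    print("TCP echo:", c.recv(1024))

# UDP client: simply fires a datagram at the destination address.
with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as c:
    c.sendto(b"hello over udp", (HOST, UDP_PORT))
    print("UDP echo:", c.recvfrom(1024)[0])
```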
Big data of different types, such as text and images, is rapidly generated by the internet and other applications. Dealing with this data using traditional methods is impractical because it comes in various sizes and types and with different processing-speed requirements. Data analytics has therefore become an important tool, since only meaningful information is analyzed and extracted, which makes it essential for big data applications. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, this paper discusses how the revolution of data analytics based on artificial intelligence algorithms might provide
Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional algorithms in data mining and machine learning do not scale well with data size. Mining and learning from big data need time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining an
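The abstract does not detail the aggregation structure; as a rough sketch of the idea it describes, the code below maintains incremental count/sum/sum-of-squares summaries of a numeric stream at several bin widths, so coarser resolutions trade accuracy for memory. The bin widths and the statistics kept are assumptions for illustration.

```python
from collections import defaultdict

class MultiResolutionAggregator:
    """Incrementally summarize a numeric stream at several resolutions."""

    def __init__(self, bin_widths=(1.0, 10.0, 100.0)):  # illustrative widths
        # One histogram per resolution: bin -> [count, sum, sum of squares].
        self.levels = {w: defaultdict(lambda: [0, 0.0, 0.0]) for w in bin_widths}

    def add(self, value):
        """Update every resolution with one new observation (incremental build)."""
        for width, bins in self.levels.items():
            stats = bins[int(value // width)]
            stats[0] += 1
            stats[1] += value
            stats[2] += value * value

    def mean(self, width):
        """Overall mean recovered from the chosen resolution's summaries."""
        bins = self.levels[width]
        n = sum(c for c, _, _ in bins.values())
        return sum(s for _, s, _ in bins.values()) / n if n else float("nan")

    def num_bins(self, width):
        return len(self.levels[width])

agg = MultiResolutionAggregator()
for x in range(1000):          # stand-in for a large multi-source stream
    agg.add(x * 0.37)
# Coarser resolution -> far fewer bins to store, same incremental update cost.
print(agg.num_bins(1.0), agg.num_bins(100.0), round(agg.mean(10.0), 2))
```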