With the development of cloud computing in recent years, data center networks have become a major topic in both industry and academia. Traditional methods based on manual configuration and dedicated hardware are burdensome, expensive, and cannot fully exploit the capacity of the physical network infrastructure. Software-Defined Networking (SDN) has therefore been regarded as one of the most promising solutions for the future Internet. SDN is notable for two features: the separation of the control plane from the data plane, and network development through programmable capabilities instead of hardware solutions. This paper introduces an SDN-based optimized Rescheduling Algorithm (SDN-RA) for cloud data center networks. The performance of SDN-RA is validated and compared against two corresponding SDN methods, ECMP and Hedera. The simulation environment is implemented using a Fat-Tree topology over the Mininet emulator connected to the Ryu SDN controller. The performance evaluation of SDN-RA shows an increase in network throughput and link utilization, along with a reduction in RTT delay and loss rate.
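Static ECMP, one of the baselines the paper compares against, hashes a flow's header fields to pick one of several equal-cost paths; long-lived flows that collide on the same path stay there, which is the limitation that rescheduling schemes such as Hedera and SDN-RA target. A minimal sketch (the 5-tuple values and path names are illustrative, not taken from the paper):

```python
import hashlib

def ecmp_path(five_tuple, paths):
    """Static ECMP: hash the flow's 5-tuple to select one of the
    equal-cost paths. The mapping is deterministic, so a large
    ('elephant') flow never moves off a congested path."""
    h = int(hashlib.md5(str(five_tuple).encode()).hexdigest(), 16)
    return paths[h % len(paths)]

# Hypothetical core switches of a Fat-Tree and one TCP flow
paths = ["core1", "core2", "core3", "core4"]
flow = ("10.0.0.1", "10.0.0.2", 5000, 80, "tcp")

p1 = ecmp_path(flow, paths)
assert ecmp_path(flow, paths) == p1   # same flow always hashes to the same path
```

A rescheduler like SDN-RA would instead monitor flow sizes at the controller and move heavy flows to less-utilized paths.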
A Genetic Algorithm optimization model is used in this study to find the optimum flow values of the Tigris river branches near Ammara city, whose water is to be used for central marshes restoration after mixing in Maissan River. These tributaries are the Al-Areed, AlBittera, and Al-Majar Al-Kabeer Rivers. The aim of the model is to enhance the water quality in Maissan River and hence provide acceptable water quality for marsh restoration. The model is applied for different water quality change scenarios, i.e., 10% and 20% increases in EC, TDS, and BOD. The model outputs are the optimum flow values for the three rivers, while the input data are monthly flows (1994-2011), monthly water requirements, and water quality parameters (EC, TDS, BOD, DO, and…
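A genetic algorithm of the kind the abstract describes can be sketched as follows. All numbers here (tributary EC values, flow bounds, the required total flow) are hypothetical placeholders, not the paper's data, and simple flow-weighted (conservative) mixing is assumed:

```python
import random

random.seed(42)

# Hypothetical data: EC (uS/cm) of the three tributaries and flow bounds (m3/s)
EC = [1200.0, 900.0, 1500.0]           # Al-Areed, AlBittera, Al-Majar Al-Kabeer
BOUNDS = [(5.0, 50.0)] * 3
MIN_TOTAL = 60.0                       # required combined flow into Maissan River

def mixed_ec(flows):
    """EC of the mixed water under flow-weighted mixing."""
    return sum(q * c for q, c in zip(flows, EC)) / sum(flows)

def fitness(flows):                    # lower is better
    shortfall = max(0.0, MIN_TOTAL - sum(flows))
    return mixed_ec(flows) + 100.0 * shortfall   # penalise infeasible totals

def random_individual():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def crossover(a, b):
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(ind, rate=0.3):
    return [min(hi, max(lo, q + random.gauss(0.0, 2.0))) if random.random() < rate else q
            for q, (lo, hi) in zip(ind, BOUNDS)]

pop = [random_individual() for _ in range(40)]
initial_best = min(fitness(ind) for ind in pop)
for _ in range(100):
    pop.sort(key=fitness)
    elite = pop[:10]                   # elitism keeps the best solutions
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(30)]

best = min(pop, key=fitness)
assert fitness(best) <= initial_best   # elitism guarantees monotone improvement
```

The real model would replace the toy fitness function with the paper's monthly flow, water-requirement, and quality constraints for each scenario.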
The efficiency of railway line performance is evaluated through a set of indicators and criteria, the most important of which are transport density, employee productivity, passenger-vehicle productivity, freight-wagon productivity, and locomotive productivity. This study attempts to calculate the most important of these indicators, the transport density index, from the four productivity indicators using artificial neural network technology. Two neural network software packages were used in this study, Simulnet and Neuframe, and the results of the second program were adopted. The training and testing of the neural network used data obtained from the international…
Community detection is useful for better understanding the structure of complex networks. It aids in extracting the required information from such networks and plays a vital role in fields ranging from healthcare to regional geography, economics, human interactions, and mobility. Detecting community structure involves partitioning a complex network into groups of nodes with dense connections within each community and sparse connections between communities. In the literature, two main measures, namely Modularity (Q) and Normalized Mutual Information (NMI), have been used for evaluating the validity and quality of the detected community structures. Although many optimization algo…
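The Modularity measure mentioned above can be computed directly from Newman's definition, Q = (1/2m) Σ_ij [A_ij − k_i k_j / (2m)] δ(c_i, c_j). A small illustration on a hypothetical toy graph:

```python
def modularity(A, labels):
    """Newman's modularity Q for an undirected graph given as an
    adjacency matrix A and a community label per node."""
    n = len(A)
    m2 = sum(sum(row) for row in A)     # 2m: twice the number of edges
    k = [sum(row) for row in A]         # node degrees
    q = 0.0
    for i in range(n):
        for j in range(n):
            if labels[i] == labels[j]:
                q += A[i][j] - k[i] * k[j] / m2
    return q / m2

# Hypothetical toy graph: two triangles joined by one bridge edge
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
A = [[0] * 6 for _ in range(6)]
for i, j in edges:
    A[i][j] = A[j][i] = 1

good = modularity(A, [0, 0, 0, 1, 1, 1])   # the two triangles as communities
bad = modularity(A, [0, 1, 0, 1, 0, 1])    # an arbitrary split
assert good > bad   # Q rewards dense intra-community, sparse inter-community links
```

Partitions with many intra-community edges relative to a random-graph baseline score higher, which is exactly what the optimization algorithms surveyed here try to maximize.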
A new concentrating basin with four mirrors was designed and built, and its effect on all parameters evaluating the performance of a p-type silicon solar cell was studied with the cooling system switched off and on. It was noted that the efficiency of the cell increased from 11.94 to 21 without cooling, while with cooling the efficiency increased to…
The purpose of this paper is to model and forecast white oil prices during the period 2012-2019 using GARCH-class volatility models. After showing that the squared returns of white oil exhibit significant long memory in volatility, fractional GARCH models are estimated for the return series, and the mean and volatility are forecast by quasi-maximum likelihood (QML) as the traditional method, while the competing approach is a machine learning method, Support Vector Regression (SVR). The results identify the most appropriate model among many candidates for forecasting volatility based on the lowest values of the Akaike and Schwarz information criteria, with the additional requirement that the parameters be significant. In addition, the residuals…
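The GARCH-class volatility dynamics referred to above can be illustrated with the simplest member of the family, GARCH(1,1); the parameter values and returns below are illustrative, not estimates from the paper's white oil series:

```python
def garch11_path(returns, omega, alpha, beta):
    """Conditional variance recursion of a GARCH(1,1) model:
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    uncond = omega / (1.0 - alpha - beta)   # unconditional variance
    sigma2 = [uncond]                       # start at the unconditional level
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r ** 2 + beta * sigma2[-1])
    return sigma2

# Illustrative parameters with high persistence (alpha + beta = 0.95)
omega, alpha, beta = 0.05, 0.1, 0.85
returns = [0.2, -1.5, 0.3, 2.0, -0.4, 0.1]
vol = garch11_path(returns, omega, alpha, beta)
assert vol[4] > vol[3]   # the large shock r = 2.0 raises next-period variance
```

Fractional (long-memory) GARCH variants replace this short-memory recursion with hyperbolically decaying lag weights, which is what the long memory found in the squared returns motivates.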
3D models derived from digital photogrammetric techniques have massively increased and developed to meet the requirements of many applications. The reliability of these models depends essentially on the data processing cycle and the adopted tool solution, in addition to data quality. Agisoft PhotoScan is a professional image-based 3D modelling package that seeks to create orderly, precise 3D content from still images. It works with arbitrary images captured under both controlled and uncontrolled conditions. Following the recommendations of many users around the globe, Agisoft PhotoScan has become an important source for generating precise 3D data for different applications. How reliable is this data for accurate 3D mo…
Abstract:
Clustered data are common in the social, health, and behavioral sciences. In this type of data the observations are linked, and the clusters can be expressed through the relationship between measurements on units within the same group.
In this research, the reliability function of clustered data is estimated by using the seemingly unrelate…
In this research, a comparison is made between the robust M-estimators for the cubic smoothing spline technique, which avoid the problem of non-normal or contaminated errors, and the traditional estimation method for cubic smoothing splines, using two comparison criteria (MADE and WASE) for different sample sizes and contamination levels, in order to estimate the time-varying coefficient functions for balanced longitudinal data. Such data are characterized by observations obtained from (n) independent subjects, each measured repeatedly at a set of (m) specific time points; since the repeated measurements within subjects are almost connected an…
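The abstract is truncated before naming the specific M-estimator compared; a common choice is Huber's, whose weight function leaves small residuals untouched and downweights outliers. A minimal sketch (the tuning constant c = 1.345 is the conventional default, assumed here):

```python
def huber_weight(r, c=1.345):
    """Huber M-estimation weight: residuals within the tuning band keep
    full weight; larger residuals get weight c/|r|, capping their
    influence on the fitted smoothing spline."""
    return 1.0 if abs(r) <= c else c / abs(r)

assert huber_weight(0.5) == 1.0            # small residual: untouched
assert huber_weight(10.0) == 1.345 / 10.0  # outlier: sharply downweighted
```

In a robust smoothing-spline fit these weights are recomputed from the residuals at each iteration (iteratively reweighted least squares), so contaminated observations cannot dominate the estimated coefficient functions.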
This study presents a modification of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) update (H-version) based on the determinant property of the inverse of the Hessian matrix (the second derivative of the objective function). The vector s (the difference between the next solution and the current solution) is updated such that the determinant of the next inverse Hessian equals the determinant of the current inverse Hessian at every iteration. Consequently, the sequence of inverse Hessian matrices generated by the method never approaches a near-singular matrix, so the program never breaks down before the minimum value of the objective function is obtained. Moreover, the new modification of the BFGS update (H-vers…
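The determinant property underlying the modification can be checked numerically with the standard BFGS inverse-Hessian update, for which det(H⁺) = det(H) · (sᵀH⁻¹s)/(yᵀs). One reading of the paper's idea (the exact scaling rule is not given in this excerpt) is to rescale s so that this ratio equals one; the matrices and vectors below are illustrative:

```python
import numpy as np

def bfgs_h_update(H, s, y):
    """Standard BFGS update of the inverse Hessian approximation H:
    H+ = (I - rho*s*y') H (I - rho*y*s') + rho*s*s',  rho = 1/(y's)."""
    rho = 1.0 / (y @ s)
    V = np.eye(len(s)) - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Hypothetical SPD inverse Hessian, step s, and gradient difference y
H = np.array([[2.0, 0.5], [0.5, 1.0]])
s = np.array([1.0, -0.5])
y = np.array([0.8, 0.3])          # y @ s > 0, so the curvature condition holds

H_next = bfgs_h_update(H, s, y)

# Known identity: det(H+) = det(H) * (s' H^{-1} s) / (y' s)
ratio = (s @ np.linalg.solve(H, s)) / (y @ s)
assert np.isclose(np.linalg.det(H_next), np.linalg.det(H) * ratio)

# Determinant-preserving variant: rescale s so the ratio becomes one
theta = (y @ s) / (s @ np.linalg.solve(H, s))
H_det = bfgs_h_update(H, theta * s, y)
assert np.isclose(np.linalg.det(H_det), np.linalg.det(H))
```

Since theta > 0 whenever H is positive definite and yᵀs > 0, the rescaled step keeps the curvature condition, and the constant determinant prevents the update from drifting toward a near-singular matrix, as the abstract claims.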