With the development of cloud computing in recent years, data center networks have become a major topic in both industry and academia. Traditional approaches based on manual configuration and dedicated hardware are burdensome, expensive, and cannot fully exploit the capabilities of the physical network infrastructure. Software-Defined Networking (SDN) has therefore been promoted as one of the most promising directions for the future Internet. SDN is notable for two features: the separation of the control plane from the data plane, and the replacement of hardware-bound solutions with programmable capabilities for network development. This paper introduces an SDN-based optimized Reschedule Algorithm (SDN-RA) for cloud data center networks. The performance of SDN-RA is validated and its results are compared against two corresponding SDN methods, ECMP and Hedera. The simulation environment is implemented on a Fat-Tree topology in the Mininet emulator, connected to the Ryu SDN controller. The evaluation shows that SDN-RA improves throughput and link utilization while reducing RTT delay and loss rate.
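The abstract does not reproduce the algorithm itself; the following is a hypothetical sketch, in the spirit of an SDN reschedule heuristic, of how a controller might pick among the equal-cost paths of a Fat-Tree by current link utilization instead of hashing as ECMP does. The names `pick_path` and `ecmp_path` are illustrative, not from the paper.

```python
# Hypothetical utilization-aware path selection, sketched for contrast
# with ECMP-style hashing. Not the paper's actual SDN-RA algorithm.

def pick_path(paths, link_load):
    """Choose the equal-cost path whose busiest link is least loaded.

    paths     -- list of paths, each a list of link identifiers
    link_load -- dict mapping link id -> current utilization in [0, 1]
    """
    def bottleneck(path):
        return max(link_load.get(link, 0.0) for link in path)
    return min(paths, key=bottleneck)

def ecmp_path(paths, flow_id):
    """ECMP-style static hashing: ignores current load entirely."""
    return paths[hash(flow_id) % len(paths)]
```

A controller applying the first function when a flow's current path becomes congested is the basic rescheduling idea; everything else (flow detection thresholds, OpenFlow rule installation via Ryu) is omitted.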
Big data of different types, such as text and images, is rapidly generated by the Internet and other applications. Dealing with this data using traditional methods is not practical, since it comes in various sizes and types and with different processing-speed requirements. Data analytics has therefore become an essential tool for big data applications, because it analyzes the raw data and extracts only the meaningful information. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, this paper discusses how the revolution of data analytics based on artificial intelligence algorithms might provide
Background: Chronic total occlusion (CTO) of the coronary arteries remains one of the most challenging lesion subsets in interventional cardiology, even with advances in medical devices and operator expertise. Successful revascularization results in improved angina status, increased exercise capacity, and a reduced need for later CABG surgery.
Objectives: This study sought to determine the overall procedural success rate of percutaneous coronary intervention (PCI) for CTOs and to examine the relation between procedural success and variables such as patients' characteristics, risk factors, and lesion characteristics.
Methods: In this study, the clinical and coronary angiography data of 80 patients with CTO who underwent PCI
The Noor oil field is one of the smallest fields in Missan province. Twelve wells penetrate the Mishrif Formation in the Noor field, and eight of them were selected for this study. The Mishrif Formation is one of the most important reservoirs in the Noor field; it consists of one anticlinal dome and is bounded by the Khasib Formation at the top and the Rumaila Formation at the bottom. The reservoir was divided into eight units separated by isolating units, following the partitioning adopted in the surrounding fields.
In this paper, frequency-distribution histograms of porosity, permeability, and water saturation were plotted for the MA unit of the Mishrif Formation in the Noor field, and then transformed to the normal distribution by applying the Box-Cox transformation algorithm
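The Box-Cox transformation mentioned above can be sketched as follows; `lam` is the transformation parameter, which in practice is chosen (e.g. by maximum likelihood) so that the transformed data is as close to normal as possible. The example value is illustrative, not from the paper.

```python
import math

# Minimal sketch of the Box-Cox power transformation for a single
# positive value; lam is the transformation parameter.
def box_cox(x, lam):
    if x <= 0:
        raise ValueError("Box-Cox requires strictly positive data")
    if lam == 0:
        return math.log(x)              # the lam -> 0 limit
    return (x ** lam - 1.0) / lam

# e.g. transforming a porosity fraction: box_cox(0.15, 0.5)
```

Applied element-wise to a porosity, permeability, or water-saturation sample, this reshapes a skewed histogram toward normality before further statistical analysis.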
The high jump performed with the Fosbury flop technique is regarded as one of the difficult athletic events in terms of training and mastery, due to the hard technique of its performance on one hand, and because it depends on the athlete's ability to overcome body-weight resistance against gravity on the other, in addition to the strong ability to control body posture when leaving the ground and flying over the bar. This event requires high explosive power at the moment of take-off, and this explosive power depends on the duration of the take-off, so the two researchers aimed to use a mechanical jumping platform and an electronic one in several training exercises with one foot and both feet in different directions and positions, in order to reduce the take-off time an
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on algorithms that use the network most efficiently. It is also important to consider the security aspect, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman-filtering (KF) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
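The recursive estimation idea can be sketched with the Kalman-filter update for the simplest DLM, the scalar local-level model (a level observed with noise); under this model the one-step prediction behaves like an EWMA-style recursion, which is one reason these procedures are naturally compared. The parameter names `q` and `r` are illustrative, not the study's notation.

```python
# Minimal sketch of the recursive Kalman update for a scalar
# local-level DLM: state = underlying level, observed with noise.
def kalman_local_level(observations, q, r, m0=0.0, c0=1e6):
    """q: state-evolution variance, r: observation variance."""
    m, c = m0, c0                # posterior mean / variance of the level
    estimates = []
    for y in observations:
        R = c + q                # prior variance after state evolution
        K = R / (R + r)          # Kalman gain
        m = m + K * (y - m)      # shift the mean toward the observation
        c = (1.0 - K) * R        # updated posterior variance
        estimates.append(m)
    return estimates
```

Each pass touches only the previous mean and variance, which is what makes the procedure recursive and suitable for streaming autocorrelated data.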
Advances in digital technology and the World Wide Web have led to an increase in digital documents used for various purposes, such as publishing and digital libraries. This phenomenon highlights the need for effective techniques that can help in the search and retrieval of text. One of the most needed tasks is clustering, which categorizes documents automatically into meaningful groups. Clustering is an important task in data mining and machine learning, and its accuracy depends tightly on the chosen text representation method. Traditional methods model documents as bags of words using term frequency-inverse document frequency (TF-IDF) weights. This method ignores the relationship an
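The traditional representation described above can be sketched in a few lines: each document becomes a vector of term weights, with no notion of word order or of semantic relationships between terms, which is exactly the limitation the abstract raises.

```python
import math
from collections import Counter

# Minimal sketch of the TF-IDF bag-of-words representation.
def tfidf(docs):
    """docs: list of token lists -> list of {term: weight} dicts."""
    n = len(docs)
    # document frequency: in how many documents each term appears
    df = Counter(term for doc in docs for term in set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({
            term: (count / len(doc)) * math.log(n / df[term])
            for term, count in tf.items()
        })
    return vectors
```

Note that a term occurring in every document gets weight zero, and that "bank" (river) and "bank" (finance) collapse into one dimension; both effects illustrate why purely frequency-based representations can limit clustering accuracy.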
Agent-based modeling is currently used extensively to analyze complex systems, a growth supported by its ability to convey distinct levels of interaction in a complex, detailed environment. Meanwhile, agent-based models tend to become progressively more complex, so powerful modeling and simulation techniques are needed to address this rise in complexity. In recent years, a number of platforms for developing agent-based models have been developed. In practice, most of them offer a discrete representation of the environment and a single level of interaction, while two or three levels are rarely considered in agent-based models. The key issue is that modellers' work in these areas is not assisted by simulation platforms
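The single-level, discrete-environment style that the abstract says most platforms support can be sketched as a minimal agent loop on a bounded grid; everything here (random-walk agents, grid size, tick count) is illustrative, not taken from any particular platform.

```python
import random

# Minimal one-level agent-based model: agents take random steps on a
# bounded discrete grid each tick. No agent-agent or multi-level
# interaction is modelled, which is exactly the limitation discussed.
def step(agents, size, rng):
    moves = [(0, 1), (0, -1), (1, 0), (-1, 0)]
    new_agents = []
    for x, y in agents:
        dx, dy = rng.choice(moves)
        # clamp the move to the grid boundary
        new_agents.append((min(max(x + dx, 0), size - 1),
                           min(max(y + dy, 0), size - 1)))
    return new_agents

def simulate(n_agents=10, size=20, ticks=50, seed=0):
    rng = random.Random(seed)
    agents = [(rng.randrange(size), rng.randrange(size))
              for _ in range(n_agents)]
    for _ in range(ticks):
        agents = step(agents, size, rng)
    return agents
```

Supporting two or three interaction levels would mean coupling loops like this one across scales (e.g. agents inside groups inside regions), which is the kind of structure the abstract notes platforms rarely assist with.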
This paper presents a method to classify colored textural images of skin tissues. Since medical images are highly heterogeneous, developing a reliable skin-cancer detection process is difficult, and a mono-fractal dimension is not sufficient to classify images of this nature. Multifractal-based feature vectors are suggested here as an alternative and more effective tool. At the same time, multiple color channels are used to obtain more descriptive features. Two multifractal-based sets of features are suggested: the first set measures the local roughness property, while the second set measures the local contrast property. A combination of all the features extracted from the three color models gives the highest classification accuracy of 99.4
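For context, the mono-fractal baseline that the paper argues is insufficient can be sketched as a box-counting dimension estimate on a binary grid; the multifractal feature vectors themselves (local roughness and contrast measures per color channel) are not reproduced here, and this sketch is an illustration rather than the paper's method.

```python
import math

# Box-counting dimension on a square binary grid: count occupied boxes
# at several scales, then fit log(count) against log(1/box_size).
def box_count(grid, box):
    """Count boxes of side `box` containing at least one set pixel."""
    n = len(grid)
    count = 0
    for i in range(0, n, box):
        for j in range(0, n, box):
            if any(grid[a][b]
                   for a in range(i, min(i + box, n))
                   for b in range(j, min(j + box, n))):
                count += 1
    return count

def box_dimension(grid, sizes=(1, 2, 4, 8)):
    """Least-squares slope of log(count) versus log(1/box_size)."""
    xs = [math.log(1.0 / s) for s in sizes]
    ys = [math.log(box_count(grid, s)) for s in sizes]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

A single slope like this summarizes the whole image with one number; multifractal analysis instead computes a spectrum of such exponents over local neighborhoods, giving the richer feature vectors the paper relies on.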