With the development of cloud computing in recent years, data center networks have become a major topic in both industry and academia. Traditional methods based on manual configuration and dedicated hardware are burdensome, expensive, and cannot fully exploit the capacity of the physical network infrastructure. Software-Defined Networking (SDN) has therefore been regarded as one of the most promising solutions for the future Internet. SDN is notable for two features: the separation of the control plane from the data plane, and the replacement of hardware solutions with programmable capabilities for network development. This paper introduces an SDN-based optimized Reschedule Algorithm (SDN-RA) for cloud data center networks. The performance of SDN-RA is validated and compared against two corresponding SDN methods, ECMP and Hedera. The simulation environment is implemented using a Fat-Tree topology over the Mininet emulator connected to the Ryu SDN controller. The performance evaluation of SDN-RA shows improved throughput and link utilization, along with reduced RTT delay and loss rate.
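The paper's SDN-RA implementation is not reproduced here, but the ECMP baseline it is compared against can be sketched in a few lines of Python: a flow's 5-tuple is hashed to pick one of the equal-cost core paths of a fat-tree. The path names and flow tuples below are hypothetical, purely for illustration.

```python
import hashlib

def ecmp_select_path(flow, paths):
    """Hash a flow's 5-tuple to pick one of several equal-cost paths.

    ECMP pins all packets of a flow to one path (preserving ordering),
    but large "elephant" flows can collide on the same path -- the
    imbalance that rescheduling schemes such as Hedera try to correct.
    """
    key = "|".join(str(field) for field in flow).encode()
    digest = int(hashlib.md5(key).hexdigest(), 16)
    return paths[digest % len(paths)]

# Hypothetical equal-cost core paths in a small fat-tree
paths = ["core-1", "core-2", "core-3", "core-4"]
flow_a = ("10.0.0.1", "10.0.0.2", "tcp", 5000, 80)  # src, dst, proto, sport, dport
flow_b = ("10.0.0.1", "10.0.0.2", "tcp", 5001, 80)

path_a1 = ecmp_select_path(flow_a, paths)
path_a2 = ecmp_select_path(flow_a, paths)  # same flow -> same path, always
path_b = ecmp_select_path(flow_b, paths)   # a different flow may map elsewhere
```

Because the hash is deterministic, the same flow always lands on the same path; a rescheduler like SDN-RA would instead move long-lived flows off congested paths.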
The use of parametric models and their associated estimation methods requires several primary conditions to be met in order for those models to represent the population under study adequately. This has prompted researchers to look for more flexible alternatives, namely nonparametric models. Many researchers are interested in the permanence (survival) function and its estimation methods, among them the nonparametric ones. For the purpose of statistical inference about the parameters of the statistical distribution of lifetimes with censored data, the experimental section of this thesis compares nonparametric methods for estimating the permanence function, given the existence …
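As an illustration of the nonparametric approach discussed above (assuming the "permanence" function is the survival function, and not the thesis's specific method), the standard Kaplan–Meier estimator for right-censored lifetimes can be sketched as:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survival (permanence) function.

    times  : observed times (event or censoring)
    events : 1 if the event occurred at that time, 0 if right-censored
    Returns a list of (t, S(t)) at each distinct event time.
    """
    data = sorted(zip(times, events))
    n = len(data)
    s = 1.0
    at_risk = n
    curve = []
    i = 0
    while i < n:
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t and e == 1)  # events at t
        removed = sum(1 for tt, _ in data if tt == t)       # leaving risk set
        if d > 0:
            s *= 1.0 - d / at_risk
            curve.append((t, s))
        at_risk -= removed
        while i < n and data[i][0] == t:  # skip past ties at t
            i += 1
    return curve

# Toy data: four subjects, one censored at t=2
curve = kaplan_meier([1, 2, 2, 3], [1, 0, 1, 1])
```

The estimate drops only at event times, and censored observations simply shrink the risk set, which is what makes the method suitable for censored lifetime data.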
Visual analytics has become an important approach for discovering patterns in big data. As visualization already struggles with high-dimensional data, issues such as a concept hierarchy on each dimension add further difficulty and can make visualization prohibitive. A data cube offers multi-perspective aggregated views of large data sets and has important applications in business and many other areas. It has high dimensionality, concept hierarchies, a vast number of cells, and comes with special exploration operations such as roll-up, drill-down, slicing, and dicing. All of these issues make data cubes very difficult to explore visually. Most existing approaches visualize a data cube in 2D space and require preprocessing steps. In this paper, we propose a visu…
This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each of them affects the others. The data were acquired from a private Iraqi biochemical laboratory. However, the data have many dimensions, a high rate of null values, and a large number of patients. Several experiments were applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification and Regression Trees (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), and Naïve Bayes (NB…
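The k-means step mentioned above can be illustrated on a single test dimension. This is a from-scratch sketch with hypothetical values, not the study's data or code:

```python
def kmeans_1d(values, k, iters=20):
    """Plain k-means on one numeric test result (1-D), no libraries.

    Deterministic init: the first k distinct sorted values as centroids.
    """
    centroids = sorted(set(values))[:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda c: abs(v - centroids[c]))
            clusters[nearest].append(v)
        # Recompute each centroid; keep the old one if its cluster emptied
        centroids = [sum(c) / len(c) if c else centroids[j]
                     for j, c in enumerate(clusters)]
    return centroids, clusters

# Two clearly separated groups of hypothetical readings
vals = [0.9, 1.0, 1.1, 9.8, 10.0, 10.2]
centroids, clusters = kmeans_1d(vals, k=2)
```

On well-separated data like this the centroids converge to the group means; on the real laboratory data, the abstract notes, the unsupervised results were not clear, motivating the supervised follow-up.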
Portable devices such as smartphones, tablet PCs, and PDAs are a useful combination of hardware and software aimed at mobile workers. While they offer the ability to review documents, communicate via electronic mail, and manage appointments and meetings, they usually lack a variety of essential security features. To address the security concerns around sensitive data, many individuals and organizations, aware of the associated threats, mitigate them by improving user authentication, content encryption, malware protection, firewalls, intrusion prevention, and so on. However, no standards have yet been developed to determine whether such mobile data management systems adequately provide the fu…
Data security is a key concern in the current era, in which data is transmitted from multiple sources over multiple channels. Data leakage and security loopholes are widespread, and there is a need to enforce higher levels of security, privacy, and integrity. The affected domains include e-governance, social networking, e-commerce, transportation, logistics, professional communications, and many others. Work on security and integrity is prominent in both network-based and private environments. This manuscript presents the effective use of a security-based methodology implemented with blockchain…
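As a minimal illustration of the integrity property that blockchain provides (not the manuscript's actual system), each block can commit to the hash of its predecessor, so tampering with any earlier record invalidates every later hash:

```python
import hashlib
import json

def make_block(index, data, prev_hash):
    """Build a block whose hash covers its content and its predecessor's hash."""
    body = {"index": index, "data": data, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    body["hash"] = digest
    return body

def chain_valid(chain):
    """Recompute every hash and check each block points at the one before it."""
    for i, blk in enumerate(chain):
        body = {"index": blk["index"], "data": blk["data"], "prev": blk["prev"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if blk["hash"] != expected:
            return False
        if i > 0 and blk["prev"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block(0, "genesis", "0" * 64)
b1 = make_block(1, "record-A", genesis["hash"])
chain = [genesis, b1]
```

Changing any field of an earlier block breaks validation of the whole chain, which is the integrity guarantee the abstract appeals to.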
Objectives: The study aims to evaluate the effectiveness of an educational program on nurses' knowledge of nursing management for patients undergoing percutaneous coronary intervention (PCI), and to determine the relationship between nurses' knowledge and some of their demographic characteristics (age, gender, level of education, and years of experience in cardiac units).
Methodology: A quasi-experimental one-group (pre- and post-test) study was conducted at the Heart Center in Al-Diwaniyah city for the period from December 7, 2019 to February 23, 2020. A sample of 40 nurses working in the heart center was chosen from different nursing job titles. The sample covered one gro…
Wireless sensor applications are subject to energy constraints, and most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are the two most widely used strategies for reducing energy usage and extending the lifetime of wireless sensor networks. In target-tracking applications, a large amount of redundant data is produced regularly, so deploying effective data aggregation schemes is vital to eliminate this redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficient data aggregation in target-tracking applications, as the selection of an appropriate clustering algorithm may reflect positively on the data aggregati…
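The energy argument above rests on a simple mechanism: instead of every node transmitting its raw reading to the sink, a cluster head forwards one aggregate per cluster. A minimal sketch (hypothetical cluster IDs and readings, not any surveyed scheme in particular):

```python
def aggregate_at_cluster_heads(readings):
    """Average redundant readings per cluster before forwarding to the sink.

    readings: list of (cluster_id, value) pairs from individual sensors.
    Returns one aggregate per cluster, cutting the number of long-range
    transmissions from len(readings) down to the number of clusters.
    """
    sums = {}
    for cid, value in readings:
        total, count = sums.get(cid, (0.0, 0))
        sums[cid] = (total + value, count + 1)
    return {cid: total / count for cid, (total, count) in sums.items()}

# Three sensor reports collapse into two cluster-head messages
agg = aggregate_at_cluster_heads([("c1", 20.0), ("c1", 22.0), ("c2", 30.0)])
```

Real schemes differ in how cluster heads are elected and which aggregate (mean, min/max, count) is forwarded, which is exactly the design space the comparative study surveys.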
A load flow program based on the Newton–Raphson method is developed in MATLAB. The method shows a very fast and efficient rate of convergence; it is also computationally efficient and requires less computer memory through the use of sparsity techniques and other programming methods that accelerate the run speed to near real time.
The program computes the voltage magnitudes and phase angles at each bus of the network under steady-state operating conditions. It also computes the power flow and power losses for all equipment, including transformers and transmission lines, taking into consideration the effects of off-nominal tap and phase-shifting transformers, generators, shunt capacitors, sh…
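The Newton–Raphson iteration at the heart of such a program can be shown on a single power-flow equation, P = (V1·V2/X)·sin(θ). This one-variable Python sketch (with made-up per-unit values, not the MATLAB program itself) stands in for the full mismatch-vector/Jacobian update performed over every bus:

```python
import math

def solve_power_angle(P, V1, V2, X, theta0=0.1, tol=1e-10, max_iter=50):
    """Newton-Raphson on P = (V1*V2/X) * sin(theta), solving for theta.

    Each step divides the power mismatch by its derivative (the scalar
    Jacobian) -- the same update the full load flow applies to the whole
    vector of bus angles and voltage magnitudes.
    """
    theta = theta0
    for _ in range(max_iter):
        mismatch = (V1 * V2 / X) * math.sin(theta) - P
        jacobian = (V1 * V2 / X) * math.cos(theta)
        step = mismatch / jacobian
        theta -= step
        if abs(step) < tol:
            break
    return theta

# Hypothetical per-unit values: 0.5 pu transfer over a 0.4 pu reactance
theta = solve_power_angle(P=0.5, V1=1.0, V2=1.0, X=0.4)
```

The quadratic convergence visible here (a handful of iterations to machine precision) is the "very fast rate of convergence" the abstract claims for the full method.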
Rock engineers widely use the uniaxial compressive strength (UCS) of rocks in designing surface and underground structures. The procedure for measuring this rock strength has been standardized by both the International Society for Rock Mechanics (ISRM) and the American Society for Testing and Materials (ASTM) (Akram and Bakar, 2007). In this paper, an experimental study was performed to correlate the Point Load Index (Is(50)) and Pulse Wave Velocity (Vp) with the Unconfined Compressive Strength (UCS) of rocks. The effect of several parameters was studied. Point load, unconfined compressive strength (UCS), and pulse wave velocity (Vp) tests were performed on several rock samples with different diameters. The predicted e…
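Correlations of this kind are typically fitted by ordinary least squares, e.g. UCS ≈ a + b·Is(50). A from-scratch sketch with invented data points (illustrative only, not the paper's measurements or its fitted coefficients):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y ~ a + b*x, returned as (a, b)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Hypothetical (Is(50), UCS) pairs in MPa -- not measured data
is50 = [1.0, 2.0, 3.0, 4.0]
ucs = [24.0, 46.0, 70.0, 92.0]
a, b = fit_line(is50, ucs)  # UCS ~ a + b * Is(50)
```

The same least-squares machinery applies to the Vp–UCS correlation; only the predictor column changes.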