In recent years, data centre (DC) networks have had to support increasingly rapid traffic exchange. Software-defined networking (SDN) changes the design of conventional networks by separating the control plane from the data plane, overcoming the limitations of traditional DC networks caused by the rapidly growing number of applications, websites, data-storage needs, etc. Software-defined networking data centres (SDN-DC), based on the OpenFlow (OF) protocol, are used to achieve superior performance when executing traffic load-balancing (LB) jobs. The LB function divides traffic-flow demands among the end devices to avoid link congestion. In short, SDN is proposed to provide more effective configuration, more efficient enhancement and greater elasticity for handling massive network schemes. In this paper, the OpenDaylight controller (ODL-CO) with the new OF 1.4 protocol version and the ant colony optimization (ACO) algorithm is proposed to test the performance of the LB function using IPv6 in an SDN-DC network, by studying the throughput, data transfer, bandwidth and average delay of the network before and after applying the LB algorithm. As a result, after applying LB, throughput, data transfer and bandwidth increased, while the average delay decreased.
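As a rough illustration of how an ACO-style controller can steer flows, the sketch below is a hypothetical simplification, not the paper's actual OpenDaylight implementation: a flow is assigned to one of several candidate paths with probability proportional to pheromone (raised to `alpha`) times inverse link utilization (raised to `beta`), after which pheromone evaporates globally and is deposited on the chosen path. All function and parameter names here are assumptions for illustration.

```python
import random

def aco_select_path(paths, pheromone, utilization, alpha=1.0, beta=2.0, rng=None):
    """Pick one candidate path with probability proportional to
    pheromone^alpha * (1 / (1 + utilization))^beta, a standard ACO rule
    that favours well-reinforced, lightly loaded paths."""
    rng = rng or random.Random(0)
    weights = [(pheromone[p] ** alpha) * ((1.0 / (1.0 + utilization[p])) ** beta)
               for p in paths]
    total = sum(weights)
    r = rng.random() * total          # roulette-wheel selection
    acc = 0.0
    for p, w in zip(paths, weights):
        acc += w
        if r <= acc:
            return p
    return paths[-1]                  # float-rounding fallback

def update_pheromone(pheromone, chosen, evaporation=0.1, deposit=1.0):
    """Evaporate pheromone on every path, then reinforce the chosen one."""
    for p in pheromone:
        pheromone[p] *= (1.0 - evaporation)
    pheromone[chosen] += deposit
```

In a real SDN-DC setting the utilization values would come from controller port statistics and the chosen path would be pushed as flow rules; here they are plain dictionaries so the selection logic stands alone.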
The Kirchhoff time migration method was applied in pre- and post-stack time migration for post-processing of images collected from the Balad-Samarra (BS-92) survey line, which is situated across the Ajeel anticline oilfield. The results showed that the Ajeel anticline structure was relocated to the correct position in the migrated stacked section. Both migration approaches (pre- and post-stack) produced enhanced subsurface images and increased horizontal resolution, which was clear after the broadening of the syncline and the narrowing, or compression, of the anticline. However, each of these methods was associated with migration noise. Thus, a post-stack process was applied using dip-removal (DDMED) and band-pass filters to eliminate the artifact noise. The time-fr
A three-dimensional (3D) model extraction represents the best way to reflect reality in full detail. This explains the tendency of many scientific disciplines towards making measurements, calculations and monitoring in various fields using such models. Although there are many ways to produce a 3D model, such as images, integration techniques, and laser scanning, the quality of their products is not the same in terms of accuracy and detail. This article aims to assess the accuracy of 3D point-cloud models resulting from close-range images and laser-scan data, based on Agisoft PhotoScan and CloudCompare software, to determine the compatibility of both datasets for several applications. College of Scien
Longitudinal data is becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyse this type of data.
In this research, the focus was on grouping and analysing these data, since cluster analysis plays an important role in identifying and grouping co-expressed sub-profiles over time. The clustered profiles were then fitted with a nonparametric smoothing cubic B-spline model, which provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope. The model is also more flexible and can capture more complex patterns and fluctuations in the data.
The longitudinal balanced data profile was compiled into subgroup
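To make the cubic-spline smoothing step concrete, here is a minimal sketch, illustrative only and using SciPy's `UnivariateSpline` rather than the authors' own fitting code. A degree-3 (cubic) spline has continuous first and second derivatives, which is the smoothness property the abstract highlights; the toy sine profile stands in for a real longitudinal trajectory.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Toy longitudinal profile: a sine curve observed at 25 time points.
t = np.linspace(0, 10, 25)
y = np.sin(t)

# k=3 gives a cubic spline; s controls the smoothing/fidelity trade-off
# (s=0 interpolates the points, larger s trades fit for smoothness).
spline = UnivariateSpline(t, y, k=3, s=0)

d1 = spline.derivative(1)  # continuous first derivative
d2 = spline.derivative(2)  # continuous second derivative
```

For noisy clustered profiles one would pick `s > 0` (or cross-validate it) so the fitted curve smooths over measurement error instead of chasing every point.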
The paired-sample t-test is a classical test statistic used to test the difference between two means in paired data, but it is not robust against violation of the normality assumption. In this paper, alternative robust tests are suggested by combining jackknife resampling with the Wilcoxon signed-rank test for small sample sizes, and with the Wilcoxon signed-rank test using a normal approximation for large sample sizes. Monte Carlo simulation experiments were employed to study the performance of these test statistics in terms of their type I error rates and power. All these tests were applied on different sa
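A minimal sketch of the small-sample combination described above, assuming SciPy's `wilcoxon` implementation (the paper's own test construction may differ): each jackknife replicate leaves out one pair and applies the Wilcoxon signed-rank test to the remaining paired differences, giving a set of leave-one-out p-values whose stability can then be examined.

```python
import numpy as np
from scipy.stats import wilcoxon

def jackknife_wilcoxon_pvalues(before, after):
    """Leave-one-out jackknife over paired samples: for each i, drop
    pair i and run the Wilcoxon signed-rank test on the remaining
    differences. Returns the list of replicate p-values."""
    d = np.asarray(before, dtype=float) - np.asarray(after, dtype=float)
    pvals = []
    for i in range(len(d)):
        sub = np.delete(d, i)        # jackknife replicate without pair i
        _, p = wilcoxon(sub)         # signed-rank test on the differences
        pvals.append(float(p))
    return pvals
```

For large samples one would pass `method="approx"` (or rely on SciPy's automatic switch) to use the normal approximation the abstract mentions.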
We studied the effect of Ca doping on the properties of Bi-based superconductors by adding different amounts of CaO to the Bi2Sr2La2-xCaxCu3O10+δ compound. Consequently, we obtained three samples, A, B and C, with x = 0.0, 0.4 and 0.8, respectively. The usual solid-state reaction method was applied under optimum conditions. X-ray diffraction analysis showed that samples A and B have tetragonal structures, whereas sample C has an orthorhombic structure. In addition, the XRD analysis showed a decrease in the c-axis lattice constant, and thus in the c/a ratio, across samples A, B and C, respectively. X-ray fluorescence proved that the compositions of samples A, B and C with the ra
The advancements in Information and Communication Technology (ICT) within the previous decades have significantly changed the way people transmit or store their information over the Internet or networks, so one of the main challenges is to keep this information safe against attacks. Many researchers and institutions have realized the importance and benefits of cryptography in achieving efficient and effective secure communication. This work adopts a novel technique for a secure data cryptosystem based on chaos theory. The proposed algorithm generates a 2-dimensional key matrix having the same dimensions as the original image, filled with random numbers obtained from the 1-dimensional logistic chaotic map for given con
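An illustrative sketch of the kind of key-matrix generation described, where the function name and the byte quantization are assumptions rather than the paper's exact scheme: the 1-D logistic map x_{n+1} = r·x_n·(1 − x_n) is iterated once per matrix entry, with the control parameter r near 4 keeping the map in its chaotic regime so the sequence is highly sensitive to the initial condition x0 (which acts as the secret key).

```python
import numpy as np

def logistic_key_matrix(shape, x0=0.61, r=3.99):
    """Fill an h-by-w byte matrix from successive iterates of the
    1-D logistic map x_{n+1} = r * x_n * (1 - x_n)."""
    h, w = shape
    x = x0
    out = np.empty((h, w), dtype=np.uint8)
    for i in range(h):
        for j in range(w):
            x = r * x * (1.0 - x)            # one chaotic iteration
            out[i, j] = int(x * 256) % 256   # quantize iterate to a byte
    return out
```

A matrix generated this way with the image's own dimensions can be XORed with the image bytes to encrypt, and XORed again with the same (x0, r) to decrypt, since XOR is its own inverse.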
Recent research showed that DNA encoding and pattern matching can be used for intrusion-detection systems (IDS), with a high rate of attack detection. The evaluation of these intrusion-detection systems is based on datasets that were generated decades ago. However, numerous studies have pointed out that these datasets neither inclusively reflect real network traffic nor modern low-footprint attacks, and do not cover the current network threat environment. In this paper, a new DNA encoding for a misuse IDS based on the UNSW-NB15 dataset is proposed. The proposed system is built by constructing a DNA encoding for all values of the 49 attributes. Then attack keys (based on attack signatures) are extracted and, finally, the Raita algorithm is app
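A standard sketch of the Raita matching step itself, shown here independently of the paper's DNA encoding and over a plain character string: Raita extends the Horspool bad-character shift with a cheap last/first/middle character pre-check before comparing the full window, which rejects most misaligned windows early.

```python
def raita_search(text, pattern):
    """Raita string matching: Horspool's bad-character shift plus a
    last/first/middle pre-check before the full window comparison.
    Returns the index of the first occurrence, or -1 if absent."""
    n, m = len(text), len(pattern)
    if m == 0:
        return 0
    if m > n:
        return -1
    # Bad-character shift table built from all but the last pattern char.
    shift = {c: m - i - 1 for i, c in enumerate(pattern[:-1])}
    last, first, middle = pattern[-1], pattern[0], pattern[m // 2]
    i = 0
    while i <= n - m:
        window = text[i:i + m]
        if (window[-1] == last and window[0] == first
                and window[m // 2] == middle and window == pattern):
            return i
        # Shift by the distance of the window's last character
        # from the end of the pattern (or the full length m).
        i += shift.get(text[i + m - 1], m)
    return -1
```

In the IDS setting described, `text` would be the DNA-encoded traffic record and `pattern` an extracted attack key, so a non-negative return value signals a signature hit.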
This paper aims to detect subsurface structures of geological and economic importance by interpreting the available reflection seismic data of an area estimated to be about 740 km². The Khashim Al-Ahmer structure is part of a series of structures (Injana, Khashim Al-Ahmer, Mannsorya) trending from NW to SE, and it is located within a deeply faulted area. It consists of a single elongated, asymmetrical dome whose SW limb is steeper than its NE limb. Twenty-three seismic sections from two seismic surveys were interpreted, with a total seismic-line length of about 414.7 km. The interpretation of the seismic data focused on two reflectors (Fatha and Jeribi)
Most medical datasets suffer from missing data, due to the expense of some tests or to human faults while recording them. This issue affects the performance of machine-learning models because the values of some features will be missing. Therefore, there is a need for specific methods for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian Diabetes Disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, which are support vector machine (SVM), K-nearest neighbour (KNN), and Naïve B
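The SSA update rules (following Mirjalili et al.'s original formulation) can be sketched as below. This is not the paper's ISSA: a simple sphere function stands in for the real objective of minimizing imputation error, and the function and parameter names are illustrative. The leader salp samples around the best-so-far "food source" with a step that shrinks over iterations, while follower salps chain behind by averaging with the salp ahead of them.

```python
import math
import random

def salp_swarm_minimize(fitness, dim, lb, ub, n_salps=20, iters=100, seed=0):
    """Minimal salp swarm optimizer: leader explores around the food
    source with coefficient c1 = 2*exp(-(4l/L)^2); followers take the
    midpoint with their predecessor. Returns (best_position, best_fit)."""
    rng = random.Random(seed)
    salps = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(n_salps)]
    food, food_fit = None, float("inf")
    for l in range(1, iters + 1):
        for s in salps:                       # track the best-so-far food source
            f = fitness(s)
            if f < food_fit:
                food_fit, food = f, s[:]
        c1 = 2.0 * math.exp(-(4.0 * l / iters) ** 2)   # shrinking step scale
        for i, s in enumerate(salps):
            if i == 0:                        # leader moves around the food
                for j in range(dim):
                    step = c1 * ((ub - lb) * rng.random() + lb)
                    s[j] = food[j] + step if rng.random() < 0.5 else food[j] - step
                    s[j] = min(max(s[j], lb), ub)      # keep within bounds
            else:                             # followers average with predecessor
                for j in range(dim):
                    s[j] = (s[j] + salps[i - 1][j]) / 2.0
    return food, food_fit
```

In an imputation setting, each salp position would hold candidate values for the missing entries and the fitness would score them, e.g. by downstream classification error.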
Missing data is one of the problems that may occur in regression models. This problem is usually handled by the deletion mechanism available in statistical software, a method that weakens statistical inference because deletion reduces the sample size. In this paper, the Expectation-Maximization algorithm (EM), the Multicycle Expectation-Conditional Maximization algorithm (MC-ECM), Expectation-Conditional Maximization Either (ECME), and recurrent neural networks (RNN) are used to estimate multiple regression models when explanatory variables have some missing values. Experimental datasets were generated using the Visual Basic programming language, with missing values of the explanatory variables according to a missing-at-random general pattern and s
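As a concrete, much simplified illustration of the EM-style idea these methods share (the function name and convergence rule are assumptions, not the paper's implementation): initialize missing entries with column means, then alternate between predicting the missing values from a linear regression on the other columns and refitting that regression, until the imputed values stabilise.

```python
import numpy as np

def iterative_regression_impute(X, max_iter=50, tol=1e-6):
    """EM-flavoured imputation sketch: mean-initialize missing cells,
    then repeat {fit OLS of each incomplete column on the others using
    complete rows; re-predict its missing cells} until convergence."""
    X = np.array(X, dtype=float)
    mask = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    X[mask] = np.take(col_means, np.where(mask)[1])   # initial fill
    for _ in range(max_iter):
        old = X[mask].copy()
        for j in range(X.shape[1]):
            rows = mask[:, j]                 # rows where column j was missing
            if not rows.any():
                continue
            others = np.delete(X, j, axis=1)
            A = np.column_stack([np.ones(len(X)), others])  # intercept + covariates
            beta, *_ = np.linalg.lstsq(A[~rows], X[~rows, j], rcond=None)
            X[rows, j] = A[rows] @ beta       # "E-step": re-predict missing cells
        if np.max(np.abs(X[mask] - old)) < tol:
            break
    return X
```

Unlike deletion, this keeps every row in the sample; the full EM/ECM/ECME machinery additionally propagates the uncertainty of the imputed values rather than treating the predictions as exact.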