In recent years, data centre (DC) networks have had to sustain increasingly rapid traffic exchange. Software-defined networking (SDN) was introduced to rethink conventional networks by separating the control plane from the data plane, overcoming the limitations of traditional DC networks caused by rapidly growing numbers of applications, websites, data-storage demands, etc. Software-defined networking data centres (SDN-DC) based on the OpenFlow (OF) protocol are used to achieve superior performance when executing traffic load-balancing (LB) jobs. The LB function divides traffic-flow demands among the end devices to avoid link congestion. In short, SDN is proposed to provide more effective configuration, more efficient enhancement and greater elasticity for handling massive network schemes. In this paper, the OpenDaylight controller (ODL-CO) with the new OF 1.4 protocol and the ant colony optimization (ACO) algorithm is used to test the performance of the LB function over IPv6 in an SDN-DC network, by studying the throughput, data transfer, bandwidth and average delay of the network before and after applying the LB algorithm. The results show that, after applying LB, throughput, data transfer and bandwidth increased, while average delay decreased.
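The abstract names ant colony optimization as the LB mechanism but does not give the algorithm itself. The following is a generic ACO sketch of pheromone-guided path selection for load balancing; the path names, link loads and parameters (alpha, beta, rho, q) are illustrative assumptions, not values from the paper.

```python
import random

def aco_select_path(paths, loads, pheromone, alpha=1.0, beta=2.0, rng=None):
    """Pick a path with probability proportional to pheromone^alpha * (1/load)^beta."""
    rng = rng or random.Random(0)
    weights = [pheromone[p] ** alpha * (1.0 / loads[p]) ** beta for p in paths]
    total = sum(weights)
    r = rng.random() * total
    acc = 0.0
    for p, w in zip(paths, weights):
        acc += w
        if r <= acc:
            return p
    return paths[-1]

def update_pheromone(pheromone, chosen, loads, rho=0.1, q=1.0):
    """Evaporate all trails, then reinforce the chosen (lightly loaded) path."""
    for p in pheromone:
        pheromone[p] *= (1.0 - rho)
    pheromone[chosen] += q / loads[chosen]

paths = ["p1", "p2", "p3"]
loads = {"p1": 10.0, "p2": 2.0, "p3": 5.0}   # hypothetical link utilisation
pheromone = {p: 1.0 for p in paths}
rng = random.Random(42)
for _ in range(100):
    chosen = aco_select_path(paths, loads, pheromone, rng=rng)
    update_pheromone(pheromone, chosen, loads)
print(max(pheromone, key=pheromone.get))   # the least-loaded path accumulates pheromone
```

Over repeated rounds the least-loaded path attracts the most pheromone, so new flows are steered away from congested links, which is the intuition behind ACO-based LB.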
Gravity and magnetic data are used to study the tectonic situation of Al-Kut, Al-Hai and the surrounding areas in central Iraq. The study included the application of several processing and interpretation programs. The window method with different spacings was used to separate residual from regional anomalies in the gravity and magnetic data. The Total Horizontal Derivative (THDR) technique was used to identify fault trends in the basement and sedimentary cover rocks from the gravity and magnetic data. The identified faults in the study area show NW-SE, NE-SW, N-S and E-W trends. It is believed that these faults extend from the basement to the uppermost layer of the sedimentary cover rocks.
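The THDR used above is the magnitude of the horizontal gradient of the potential-field grid; its maxima trace the edges of density or susceptibility contrasts such as faults. A minimal numpy sketch on a synthetic grid (the step anomaly and grid spacing are illustrative assumptions, not data from this study):

```python
import numpy as np

def total_horizontal_derivative(grid, dx=1.0, dy=1.0):
    """THDR = sqrt((dF/dx)^2 + (dF/dy)^2); maxima outline source edges/faults."""
    gy, gx = np.gradient(grid, dy, dx)   # np.gradient: derivative per axis (rows, cols)
    return np.hypot(gx, gy)

# Hypothetical anomaly grid: a sharp step simulating a fault contact at x = 0
x = np.linspace(-5, 5, 101)
X, Y = np.meshgrid(x, x)
anomaly = np.where(X > 0, 10.0, 0.0) + 0.01 * Y   # mGal, illustrative only
thdr = total_horizontal_derivative(anomaly, dx=0.1, dy=0.1)
print(int(np.argmax(thdr[50])))   # column index of the THDR peak in the middle row
```

The printed index falls on the step, showing how THDR maxima are picked as fault traces on real gravity or magnetic grids.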
Big data usually runs in large-scale, centralized key-management systems. However, centralized key management aggravates problems such as a single point of failure, exchange of a secret key over insecure channels, third-party queries, and the key-escrow problem. To avoid these problems, we propose an improved certificate-based encryption scheme that ensures data confidentiality by combining symmetric and asymmetric cryptography. The combination is implemented using the Advanced Encryption Standard (AES) and Elliptic Curve Diffie-Hellman (ECDH). The proposed scheme is an enhanced version of the Certificate-Based Encryption (CBE) scheme and preserves all its advantages. However …
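The hybrid pattern described here (a Diffie-Hellman key agreement producing a symmetric key, which then encrypts the bulk data) can be sketched with the standard library alone. This is a toy stand-in, not the paper's scheme: finite-field DH over a small Mersenne prime replaces ECDH, and a hash-counter keystream replaces AES-GCM, purely to keep the sketch self-contained.

```python
import hashlib
import secrets

P = 2**127 - 1          # a Mersenne prime; toy group, NOT secure production parameters
G = 3

def keygen():
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def shared_key(priv, other_pub):
    # Hash the DH shared secret down to a 32-byte symmetric key (HKDF in practice)
    s = pow(other_pub, priv, P)
    return hashlib.sha256(s.to_bytes(16, "big")).digest()

def stream_xor(key, nonce, data):
    # Hash-counter keystream standing in for AES; the same call encrypts and decrypts
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + nonce + i.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], block))
    return bytes(out)

# Both sides derive the same key without ever sending it over the channel
a_priv, a_pub = keygen()
b_priv, b_pub = keygen()
assert shared_key(a_priv, b_pub) == shared_key(b_priv, a_pub)

key = shared_key(a_priv, b_pub)
nonce = secrets.token_bytes(12)
ct = stream_xor(key, nonce, b"big data record")
print(stream_xor(key, nonce, ct))   # round-trips to the plaintext
```

This illustrates why the hybrid avoids "exchanging a secret key over insecure channels": only public values cross the wire, yet both ends hold the same symmetric key for fast bulk encryption.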
A surface-fitting model is developed from calorimeter data for two well-known brands of household compressors. Ten-coefficient polynomial correlation equations were obtained, as functions of the refrigerant saturating and evaporating temperatures over the range −35 °C to −10 °C, using Matlab, for cooling capacity, power consumption and refrigerant mass flow rate.
Additional correlation equations for these variables are provided as a quick selection guide for choosing a proper compressor at ASHRAE standard conditions, covering a swept-volume range of 2.24–11.15 cm³.
The results indicate that these surface-fitting models are accurate to within ±15% for 72 compressor models of cooling cap…
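A ten-coefficient surface fit of the kind described is typically a full bivariate cubic (1, te, tc, te², te·tc, tc², te³, te²·tc, te·tc², tc³) solved by least squares. The sketch below fits such a surface to synthetic calorimeter points; the coefficient values, temperature ranges and sample size are illustrative assumptions, not the paper's data.

```python
import numpy as np

def design_matrix(te, tc):
    """Ten-term bivariate cubic in evaporating (te) and saturating (tc) temperatures."""
    te, tc = np.asarray(te, float), np.asarray(tc, float)
    return np.column_stack([np.ones_like(te), te, tc, te**2, te*tc, tc**2,
                            te**3, te**2*tc, te*tc**2, tc**3])

rng = np.random.default_rng(0)
te = rng.uniform(-35, -10, 60)        # evaporating temperature, °C (paper's range)
tc = rng.uniform(40, 55, 60)          # hypothetical saturating temperature, °C
true_c = np.array([500, 20, -5, 0.3, -0.1, 0.05,
                   0.004, -0.002, 0.001, -0.0005])
q = design_matrix(te, tc) @ true_c    # synthetic cooling capacity, W

coef, *_ = np.linalg.lstsq(design_matrix(te, tc), q, rcond=None)
pred = design_matrix(te, tc) @ coef
print(np.max(np.abs(pred - q)))       # fit reproduces the synthetic data closely
```

The same least-squares machinery applies to power consumption and mass flow rate, one coefficient vector per quantity.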
Sequence covering array (SCA) generation has been an active research area in recent years. Unlike sequence-less covering arrays (CA), the order of the sequence varies during the test-case generation process. This paper reviews the state of the art of SCA strategies; earlier works reported that finding a minimal test suite is an NP-hard problem. In addition, most existing SCA-generation strategies have a high order of complexity, because they generate all combinatorial interactions in a one-test-at-a-time fashion. Reducing this complexity by adopting a one-parameter-at-a-time approach for SCA generation is a challenging process. In addition, this reduction facilitates support for a higher strength of cove…
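The one-test-at-a-time fashion mentioned above can be illustrated with a tiny greedy generator: repeatedly pick the permutation that covers the most still-uncovered t-way event sequences. This is a generic sketch (4 events, strength 3), not any specific strategy from the reviewed literature.

```python
from itertools import permutations

def is_subsequence(seq, perm):
    """True if the events of seq occur in perm in the same relative order."""
    pos = {e: i for i, e in enumerate(perm)}
    return all(pos[a] < pos[b] for a, b in zip(seq, seq[1:]))

def covers_all(suite, events, t):
    """Every t-way ordered sequence appears as a subsequence of some test."""
    return all(any(is_subsequence(seq, perm) for perm in suite)
               for seq in permutations(events, t))

def greedy_sca(events, t):
    """One-test-at-a-time: add the permutation covering the most uncovered sequences."""
    uncovered = set(permutations(events, t))
    suite = []
    while uncovered:
        best = max(permutations(events),
                   key=lambda p: sum(is_subsequence(s, p) for s in uncovered))
        suite.append(best)
        uncovered -= {s for s in uncovered if is_subsequence(s, best)}
    return suite

suite = greedy_sca([0, 1, 2, 3], 3)
print(len(suite), covers_all(suite, [0, 1, 2, 3], 3))
```

Note the cost: each greedy step scans all n! candidate permutations, which is exactly the complexity blow-up the review attributes to one-test-at-a-time generation.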
The current research discusses biophysical data as a theoretical and applied knowledge base linking industrial design with the natural sciences, at the level of applied strategies through which the knowledge base of industrial design can be enriched. The research focuses on two main scientific reference areas of biophysics: electromagnetism and biomechanics. Regarding performance and functional applications in designing the functions of industrial products at the electromagnetic level, it was found that remote-sensing applications include fire sensors, adapted from the black beetle, whose sensory organs enable it to sense fire, and collision sensors, which were adapted from the insect …
Mixed-effects conditional logistic regression is evidently more effective in studying qualitative differences in longitudinal pollution data, as well as their implications for heterogeneous subgroups. This study argues that conditional logistic regression is a robust evaluation method for environmental studies, through the analysis of environmental pollution as a function of oil production and environmental factors. Consequently, it has been established theoretically that the primary objective of model selection in this research is to identify the candidate model that is optimal for the conditional design. The candidate model should achieve generalizability, goodness of fit and parsimony, and establish an equilibrium between bias and variab…
In this paper, an ARIMA model was used for estimating missing data (air temperature, relative humidity, wind speed) for mean monthly variables in different time series at three stations (Sinjar, Baghdad, Al-Hai), which represent different parts of Iraq from north to south, respectively.
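The idea of filling a gap with a time-series forecast can be shown with a deliberately minimal AR(1) stand-in for ARIMA (a full ARIMA fit would use a library such as statsmodels). The temperature values below are hypothetical, not the station data.

```python
import numpy as np

def ar1_impute(series):
    """Fill NaN gaps with a fitted AR(1) one-step forecast; a crude ARIMA stand-in."""
    x = np.asarray(series, float).copy()
    obs = x[~np.isnan(x)]
    mu = obs.mean()
    d = obs - mu
    # Lag-1 coefficient from observed values (ignores gap boundaries; sketch only)
    phi = (d[:-1] * d[1:]).sum() / (d[:-1] ** 2).sum()
    for i in range(1, len(x)):
        if np.isnan(x[i]):
            x[i] = mu + phi * (x[i - 1] - mu)   # forecast from the previous value
    return x

temps = [22.1, 23.0, np.nan, 24.2, np.nan, np.nan, 25.0]  # hypothetical monthly means
print(ar1_impute(temps))
```

Real ARIMA imputation adds differencing and moving-average terms and estimates parameters by maximum likelihood, but the forecast-then-fill loop is the same.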
Nowadays, ontology learning for describing heterogeneous systems is an influential approach to enhancing the effectiveness of such systems using Social Network representation and Analysis (SNA). This paper presents a novel scenario for constructing an adaptive architecture to develop community performance, with heterogeneous communities as a case study. Crawling the semantic web is a new approach to creating a huge data repository for classifying these communities. The architecture of the proposed system involves two cascading modules for producing the ontology data, which is represented in Resource Description Framework (RDF) format. The proposed system improves the enhancement of these environments ach…
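RDF represents everything as subject-predicate-object triples. A minimal in-memory sketch of community data as triples, serialized in N-Triples form (the namespace and resource names are hypothetical; a real pipeline would use a library such as rdflib):

```python
# Hypothetical namespace for the community ontology
EX = "http://example.org/community#"

triples = set()

def add(s, p, o):
    """Store a triple, expanding local names against the EX namespace."""
    triples.add((EX + s, EX + p, o if o.startswith("http") else EX + o))

add("alice", "memberOf", "dataScienceGroup")
add("bob", "memberOf", "dataScienceGroup")
add("dataScienceGroup", "type", "Community")

def to_ntriples(ts):
    """Serialize triples in N-Triples syntax, one statement per line."""
    return "\n".join(f"<{s}> <{p}> <{o}> ." for s, p, o in sorted(ts))

print(to_ntriples(triples))
```

Because each fact is an independent triple, data crawled from heterogeneous sources can be merged into one repository simply by unioning the triple sets.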
This research presents a structural interpretation of the Yamama Formation (Lower Cretaceous) and the Naokelekan Formation (Jurassic) using 2D seismic reflection data from the Tuba oil field, Basrah, southern Iraq. The two reflectors (Yamama and Naokelekan) were defined and picked as peak and trough in the 2D seismic reflection interpretation process, based on the synthetic seismogram and well-log data. To obtain the structural setting, these horizons were followed across the whole region. Two-way travel-time maps, depth maps and velocity maps were produced for the top Yamama and top Naokelekan formations. The study concluded that certain longitudinal closures reflect anticlines in the east and west of the study ar…
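The depth maps mentioned above come from combining the two-way travel-time (TWT) maps with the velocity maps: depth = average velocity × TWT / 2, since the wave travels down and back. A one-line sketch (the numbers are illustrative, not field values):

```python
def twt_to_depth(twt_s, v_avg_ms):
    """Convert two-way travel time (s) to depth (m) using an average velocity (m/s)."""
    return v_avg_ms * twt_s / 2.0

# Hypothetical pick: 2.4 s TWT at 3500 m/s average velocity
print(twt_to_depth(2.4, 3500.0))   # -> 4200.0 m
```

Applying this cell-by-cell to the TWT and velocity grids yields the depth map for each horizon.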