Load balance in data center SDN networks

Over the last two decades, networks have changed in response to rapidly evolving requirements. Current Data Center Networks (DCNs) contain large numbers of hosts (tens of thousands) with demanding bandwidth needs, as cloud networking and multimedia content computing continue to grow. Conventional DCNs are strained by the growing number of users and bandwidth requirements, which leads to many implementation limitations. Because current networking devices couple the control and forwarding planes, the resulting network architectures are unsuitable for dynamic computing and storage needs. Software Defined Networking (SDN) was introduced to change this notion of traditional networks by decoupling the control and forwarding planes. Moreover, with the rapid increase in applications, websites, and storage demands, some network resources are underutilized because of static routing mechanisms. To overcome these limitations, an SDN-based OpenFlow data center network architecture is used to obtain better performance parameters and to implement a traffic load-balancing function. Load balancing distributes traffic requests over the connected servers to diminish network congestion and reduce server underutilization. As a result, SDN affords more effective configuration, enhanced performance, and greater flexibility for large network designs.
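The traffic load-balancing function described above can be illustrated with a minimal round-robin sketch in plain Python, independent of any particular SDN controller or OpenFlow API; the class name and server addresses are hypothetical:

```python
from collections import defaultdict

class LoadBalancer:
    """Minimal round-robin load balancer: spreads incoming requests
    evenly over a pool of servers to avoid congestion on any one host."""

    def __init__(self, servers):
        self.servers = list(servers)
        self._next = 0
        self.assigned = defaultdict(int)  # requests handled per server

    def pick_server(self):
        # cycle through the pool in order, one request per step
        server = self.servers[self._next % len(self.servers)]
        self._next += 1
        self.assigned[server] += 1
        return server

lb = LoadBalancer(["10.0.0.1", "10.0.0.2", "10.0.0.3"])
targets = [lb.pick_server() for _ in range(9)]
# 9 requests over 3 servers: each server receives 3
```

A production SDN controller would install flow rules reflecting this choice on the switches, but the distribution policy itself is as simple as shown.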

Publication Date: July 1, 2012
Journal Name: IEEE Transactions on Geoscience and Remote Sensing
Echo Amplitude Normalization of Full-Waveform Airborne Laser Scanning Data Based on Robust Incidence Angle Estimation

Scopus (26)
Crossref (22)
Publication Date: April 14, 2023
Journal Name: Journal of Big Data
A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications
Abstract: Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have only small or inadequate datasets to train DL frameworks. Manual labeling is usually needed to provide labeled data, typically involving human annotators with broad background knowledge; this annotation process is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to learn representations automatically. Ultimately, more data generally yields a better DL model, and performance is also application-dependent. This issue is the main barrier for …
Scopus (534)
Crossref (527)
Publication Date: February 1, 2020
Journal Name: IOP Conference Series: Materials Science and Engineering
Revealing the potentials of 3D modelling techniques; a comparison study towards data fusion from hybrid sensors
Abstract: The vast advantages of the 3D modelling industry have urged competitors to improve capturing techniques and processing pipelines, with the aim of minimizing labour requirements, saving time, and reducing project risk. In digital 3D documentation and conservation projects, laser scanning and photogrammetry are compared in order to choose between the two. Since both techniques have pros and cons, this paper examines the potential issues of each technique in terms of time, budget, accuracy, density, methodology, and ease of use. A terrestrial laser scanner and close-range photogrammetry are tested to document a unique, invaluable artefact (Lady of Hatra) located in Iraq for future data fusion sc…
Scopus (12)
Crossref (7)
Publication Date: December 1, 2019
Journal Name: Journal of Accounting and Financial Studies (JAFS)
Use of information and communications technology to archive data: A suggested form in the Tax Audit and Examination Department of the General Tax Authority

The modern world is witnessing huge developments that give organizations and administrative units the opportunity to use information and communication technology, which has been adopted in administrative work because of its importance in carrying out work with greater efficiency and speed, and in facilitating communication with individuals and companies through various Internet-based means of communication. The research therefore studied electronic systems designed and adopted for creating a database for archiving data, which is the main method used in organizations and administrative units in developed countries, where such a system converts documents and manual processes and t…
Publication Date: June 1, 2012
Journal Name: Journal of Economics and Administrative Sciences
The Effect of the Stability of Some Commodity Activities in Iraq on the Estimation of the Statistical Data Models for the Period (1988-2000)

There is an implicit but fundamental assumption behind the theory of regression on time series, namely that the series used in estimation are stationary, or, in the language of Engle and Granger, integrated of order zero, denoted I(0). It is well known, for example, that tables of the t-statistic are designed primarily for regressions that use stationary series. Until the mid-seventies this assumption was treated as an axiom: researchers conducted applied studies without taking into account the properties of the time series used prior to estimation, accepting the results of these tests and the estimates obtained on the basis of the applicability of the theo…
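The stationarity (I(0)) property discussed above is commonly checked with a Dickey-Fuller-type regression. A minimal sketch on simulated data (the paper's own series are not available here), assuming the standard test regression Δy_t = α + ρ·y_{t−1} + ε_t with H0: ρ = 0 (unit root):

```python
import numpy as np

def dickey_fuller_t(y):
    """t-statistic of the Dickey-Fuller regression
    dy_t = alpha + rho * y_{t-1} + e_t  (H0: rho = 0, i.e. unit root).
    A strongly negative t rejects the unit root (series looks I(0))."""
    y = np.asarray(y, dtype=float)
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])  # intercept + lagged level
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)               # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)                # OLS covariance matrix
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(0)
e = rng.normal(size=500)
stationary = e                   # white noise: integrated of order zero, I(0)
random_walk = np.cumsum(e)       # unit-root process: I(1)
# the stationary series yields a large negative t-statistic; the random walk does not
```

Note that under the unit-root null the statistic follows the Dickey-Fuller distribution, not the usual Student-t tables, which is exactly the pitfall the abstract describes.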
Publication Date: October 19, 2018
Journal Name: Journal of Economics and Administrative Sciences
Big Data Approach to Enhance Organizational Ambidexterity: An Exploratory Study of a Sample of Managers at Asia Cell Mobile Telecommunication Company in Iraq

The research aimed at measuring the compatibility of big data with the organizational ambidexterity dimensions of the Asia Cell mobile telecommunications company in Iraq, in order to determine the possibility of adopting the big data triple as an approach to achieving organizational ambidexterity.

The study adopted the descriptive analytical approach to collect and analyze the data gathered with a questionnaire developed on the Likert scale, after a comprehensive review of the literature related to the two basic study dimensions. The data was subjected to many statistical treatments in accordance with res…
Crossref (2)
Publication Date: February 1, 2024
Journal Name: Baghdad Science Journal
Estimating the Parameters of the Exponential-Rayleigh Distribution for Progressively Censored Data with S-Function about COVID-19

The two parameters of the Exponential-Rayleigh distribution were estimated using the maximum likelihood estimation method (MLE) for progressively censored data. Estimated values for these two scale parameters were obtained using real COVID-19 data taken from the Iraqi Ministry of Health and Environment, AL-Karkh General Hospital. The Chi-square test was then utilized to determine whether the sample (data) corresponded to the Exponential-Rayleigh (ER) distribution. The nonlinear membership function (s-function) was employed to find fuzzy numbers for these parameter estimators, and the ranking function was then used to transform the fuzzy numbers into crisp numbers. Finally, mean square error (MSE) was used to compare the outcomes of the survival…
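The paper's estimator handles progressive censoring, which is not reproduced here. As a simplified sketch, the following fits the two parameters by maximum likelihood on complete (uncensored) simulated samples, assuming the common Exponential-Rayleigh parameterization with hazard h(t) = λ + θt, hence S(t) = exp(−λt − θt²/2); the parameter values and sample are illustrative only:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, t):
    """Negative log-likelihood for complete samples, assuming
    f(t) = (lam + theta*t) * exp(-lam*t - theta*t**2 / 2)."""
    lam, theta = params
    if lam <= 0 or theta <= 0:
        return np.inf
    return -(np.log(lam + theta * t).sum()
             - lam * t.sum() - theta * (t ** 2).sum() / 2)

rng = np.random.default_rng(1)
lam_true, theta_true = 0.5, 0.2
# inverse-CDF sampling: solve lam*t + theta*t^2/2 = -log(U) for t > 0
c = -np.log(rng.uniform(size=2000))
t = (-lam_true + np.sqrt(lam_true ** 2 + 2 * theta_true * c)) / theta_true

fit = minimize(neg_log_lik, x0=[1.0, 1.0], args=(t,), method="Nelder-Mead")
lam_hat, theta_hat = fit.x   # should land near (0.5, 0.2)
```

Extending this to progressive censoring would add the survival terms S(t_i) of the censored units to the likelihood.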
Scopus (1)
Publication Date: December 1, 2020
Journal Name: Journal of Economics and Administrative Sciences
Using the Moment Method to Estimate the Reliability Function of Truncated Skew-Normal Distribution Data

The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data are skewed, estimating the parameters and calculating the reliability function in the presence of skewness requires a distribution flexible enough to deal with such data. This is the case for the data of Diyala Company for Electrical Industries, where a positive skew was observed in the data collected from the Power and Machinery Department. This required a distribution that handles such data, and a search for methods that accommodate this problem and lead to accurate estimates of the reliability function, …
Publication Date: October 1, 2013
Journal Name: Proceedings of the International Astronomical Union
The infrared K-band identification of the DSO/G2 source from VLT and Keck data
Abstract: A fast-moving infrared excess source (G2), widely interpreted as a core-less gas and dust cloud, approaches Sagittarius A* (Sgr A*) on a presumably elliptical orbit. VLT Ks-band and Keck K′-band data yield clear continuum identifications and proper motions of this ∼19-magnitude Dusty S-cluster Object (DSO). In 2002-2007 it was confused with the star S63, but it has been free of confusion since 2007. Its near-infrared (NIR) colors and a comparison to other sources in the field speak in favor of the DSO being an IR excess star with photospheric continuum emission at 2 microns than a…
Scopus (3)
Crossref (1)
Publication Date: April 3, 2023
Journal Name: Journal of Electronics, Computer Networking and Applied Mathematics
Comparison of Some Estimation Methods for the Mixed Regression Model under the Multicollinearity Problem and High-Dimensional Data

In order to obtain a mixed model with high significance and accuracy, it is necessary to search for a method that selects the most important variables to include in the model, especially when the data under study suffer from multicollinearity as well as high dimensionality. The research aims to compare, using simulation, some methods of choosing the explanatory variables and estimating the parameters of the regression model, namely (unbiased) Bayesian ridge regression and the adaptive Lasso regression model; MSE was used to compare the methods.
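The kind of simulation comparison described above can be sketched with scikit-learn. Adaptive Lasso is not available there, so plain Lasso stands in for it; the sample sizes, sparsity pattern, and penalty value are illustrative assumptions, not the paper's settings:

```python
import numpy as np
from sklearn.linear_model import BayesianRidge, Lasso

rng = np.random.default_rng(42)
n, p = 100, 40                       # many predictors relative to n
beta = np.zeros(p)
beta[:5] = 2.0                       # sparse true coefficient vector

def make_data(n_rows):
    X = rng.normal(size=(n_rows, p))
    # make columns 0 and 1 nearly identical: induced multicollinearity
    X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=n_rows)
    y = X @ beta + rng.normal(size=n_rows)
    return X, y

X_train, y_train = make_data(n)
X_test, y_test = make_data(500)

mse = {}
for name, model in [("BayesianRidge", BayesianRidge()),
                    ("Lasso", Lasso(alpha=0.1))]:
    model.fit(X_train, y_train)
    mse[name] = float(np.mean((model.predict(X_test) - y_test) ** 2))
# compare methods by out-of-sample MSE, as in the paper
```

Repeating this over many simulated replicates and averaging the MSEs would give the Monte Carlo comparison the abstract refers to.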
