Load balance in data center SDN networks

Over the last two decades, networks have changed in response to rapidly evolving requirements. Current Data Center Networks (DCNs) contain large numbers of hosts (tens of thousands) with special bandwidth needs, as cloud networking and multimedia content computing continue to grow. Conventional DCNs are strained by the increasing numbers of users and bandwidth requirements, which exposes many implementation limitations. In current networking devices, the coupling of the control and forwarding planes results in network architectures that are unsuitable for dynamic computing and storage needs. Software Defined Networking (SDN) was introduced to change this notion of traditional networks by decoupling the control and forwarding planes. Moreover, with the rapid increase in the number of applications, websites, and storage demands, some network resources are underutilized because of static routing mechanisms. To overcome these limitations, an SDN-based OpenFlow data center network architecture is used to obtain better performance parameters and to implement a traffic load-balancing function. Load balancing distributes traffic requests over the connected servers to diminish network congestion and reduce server underutilization. As a result, SDN affords more effective configuration, enhanced performance, and greater flexibility for dealing with large network designs.
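The load-balancing function described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the server names and the least-connections policy are assumptions for the example.

```python
# Minimal sketch of the load-balancing idea: distribute incoming requests
# across connected servers using a least-connections policy.
# Server names and the policy are illustrative, not from the paper.

class LeastConnectionsBalancer:
    def __init__(self, servers):
        # active connection count per server
        self.active = {s: 0 for s in servers}

    def pick(self):
        # choose the server with the fewest active connections
        return min(self.active, key=self.active.get)

    def assign(self):
        server = self.pick()
        self.active[server] += 1
        return server

    def release(self, server):
        self.active[server] -= 1

balancer = LeastConnectionsBalancer(["srv-a", "srv-b", "srv-c"])
placements = [balancer.assign() for _ in range(6)]
print(placements)  # six requests spread evenly, two per server
```

In an SDN setting this policy would run in the controller, which installs the corresponding OpenFlow forwarding rules on the switches.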

Publication Date
Sun Jun 30 2013
Journal Name
Al-kindy College Medical Journal
Evaluation of Etiological Causes of Hypothyroidism among Patients Visiting Specialized Center for Diabetes and Endocrinology

Background: Many structural or functional abnormalities can impair the production of thyroid hormones and cause hypothyroidism. Objectives: To identify the main etiological causes of hypothyroidism among patients visiting the Specialized Center for Diabetes and Endocrinology. Methods: This study was conducted in the Specialized Center for Diabetes and Endocrinology on 217 patients with proven hypothyroidism, from 2006 to 2008. Every patient was assessed with thyroid function tests, ultrasound examination, thyroid autoantibodies, fine needle aspiration, skull radiology, and isotope scans, in addition to checks of adrenal and gonadal function. Results: Out of these 217 patients, 120 had thyroiditis and 33 had undergone thyroidectomy. 39 pat…

Publication Date
Fri Jul 21 2023
Journal Name
Journal Of Engineering
EVALUATION OF ELECTRONIC GOVERNMENT SECURITY ISSUES APPLIED TO COMPUTER CENTER OF BAGHDAD UNIVERSITY (CASE STUDY)

Information security directly increases the level of trust between government departments by providing assurance of the confidentiality, integrity, and availability of sensitive governmental information. Many threats, caused mainly by malicious acts, can shut down e-government services. Governments are therefore urged to implement security in e-government projects.
Some modifications were proposed to the multi-layer security assessment model (the Sabri model) to make it more comprehensive and more convenient for the Iraqi government. The proposed model can be used as a tool to assess the level of security readiness of government departments, as a checklist for the required security measures, and as a commo…

Publication Date
Sun Dec 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Contemporary Challenges for Cloud Computing Data Governance in Information Centers: An analytical study

Purpose – Cloud computing (CC) and its services have enabled the information centers of organizations to adapt their informatic and technological infrastructure, making it more suitable for developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud data governance has become more complex and dynamic, requiring an in-depth understanding of the data management strategy at these centers in terms of organizational structure and regulations, people, technology, processes, and roles and responsibilities. Our paper therefore discusses these dimensions as challenges facing information centers with respect to their data governance, and the impa…

Publication Date
Tue Aug 15 2023
Journal Name
Journal Of Economics And Administrative Sciences
Machine Learning Techniques for Analyzing Survival Data of Breast Cancer Patients in Baghdad

Machine learning methods, one of the most important branches of the promising field of artificial intelligence, are of great importance in all sciences, such as engineering and medicine, and have recently become widely involved in the statistical sciences and their various branches, including survival analysis. Machine learning can be considered a new branch used to estimate survival, in parallel with the parametric, nonparametric, and semi-parametric methods widely used to estimate survival in statistical research. In this paper, the estimation of survival based on medical images of patients with breast cancer who receive their treatment in Iraqi hospitals is discussed. Three algorithms for feature extraction are explained: the first, principal compone…
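The first feature-extraction step named above, principal component analysis, can be sketched on synthetic data. The array sizes and the number of retained components below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# PCA sketch: reduce image-derived feature vectors to a few components
# before fitting a survival model. Data and dimensions are synthetic.

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))          # 50 patients, 8 raw image features

Xc = X - X.mean(axis=0)               # center each feature
cov = Xc.T @ Xc / (len(Xc) - 1)       # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]     # sort components by explained variance
components = eigvecs[:, order[:3]]    # keep the top 3 principal components

Z = Xc @ components                   # reduced 50x3 feature matrix
print(Z.shape)
```

The reduced matrix `Z` would then serve as the covariate input to whichever survival estimator is fitted next.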

Publication Date
Mon Apr 11 2011
Journal Name
Icgst
Employing Neural Network and Naive Bayesian Classifier in Mining Data for Car Evaluation

In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and the Naïve Bayesian (NB) classifier. This paper investigates the performance of these two classification methods on the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicate that the BNN classifier yields higher accuracy than the NB classifier, but it is less efficient because it is time-consuming and difficult to analyze due to its black-box implementation.
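A minimal categorical Naïve Bayesian classifier of the kind compared above can be sketched as follows. The toy car records are illustrative, not the actual Car Evaluation dataset, and the two features (price level, number of doors) are assumptions for the example.

```python
from collections import Counter, defaultdict

# Tiny categorical Naive Bayes: class priors times per-feature conditional
# probabilities, with Laplace smoothing. Toy data, not the real dataset.

train = [
    (("high", "2"), "unacc"),
    (("high", "4"), "unacc"),
    (("low", "4"), "acc"),
    (("low", "2"), "acc"),
    (("med", "4"), "acc"),
    (("high", "2"), "unacc"),
]

priors = Counter(label for _, label in train)
cond = defaultdict(Counter)  # (feature index, label) -> value counts
for features, label in train:
    for i, value in enumerate(features):
        cond[(i, label)][value] += 1

def predict(features):
    scores = {}
    for label, prior in priors.items():
        score = prior / len(train)
        for i, value in enumerate(features):
            # Laplace smoothing avoids zero probabilities for unseen values
            score *= (cond[(i, label)][value] + 1) / (prior + 2)
        scores[label] = score
    return max(scores, key=scores.get)

print(predict(("high", "2")))  # → unacc
```

Unlike the BNN, every factor in the score is an interpretable count ratio, which is the transparency advantage the comparison above alludes to.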

Publication Date
Fri Mar 15 2019
Journal Name
Alustath Journal For Human And Social Sciences
A Developmental-Longitudinal Study of Request External Modifiers in Authentic and Elicited Data

Publication Date
Mon May 15 2017
Journal Name
Journal Of Theoretical And Applied Information Technology
Anomaly detection in text data that represented as a graph using dbscan algorithm

Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all of the data into a graph concept frame (CFG). As is well known, the DBSCAN method groups data points of the same kind into clusters, while points that fall outside the clusters are considered noise or anomalies. DBSCAN can therefore detect abnormal points that lie farther than a certain set threshold (extreme values). However, anomalies are not only those cases that are abnormal, unusual, or far from a specific group; there is also a type of data that does not occur repeatedly but is considered abnormal relative to the known group. The analysis showed DBSCAN using the…
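The core DBSCAN behavior the abstract relies on, labelling as noise any point that belongs to no dense cluster, can be sketched as below. The 2D points, `eps`, and `min_pts` are illustrative; the paper's graph representation of text is not reproduced.

```python
# Minimal DBSCAN: points with enough neighbours within eps seed clusters;
# anything left unclustered is noise (label -1), i.e. an anomaly.

def region(points, i, eps):
    xi, yi = points[i]
    return [j for j, (x, y) in enumerate(points)
            if (x - xi) ** 2 + (y - yi) ** 2 <= eps ** 2]

def dbscan(points, eps, min_pts):
    labels = [None] * len(points)     # None = unvisited, -1 = noise
    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        neighbours = region(points, i, eps)
        if len(neighbours) < min_pts:
            labels[i] = -1            # provisionally noise
            continue
        labels[i] = cluster
        queue = list(neighbours)
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster   # border point rescued from noise
            if labels[j] is not None:
                continue
            labels[j] = cluster
            more = region(points, j, eps)
            if len(more) >= min_pts:  # only core points expand the cluster
                queue.extend(more)
        cluster += 1
    return labels

points = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10)]
labels = dbscan(points, eps=1.5, min_pts=3)
print(labels)  # the isolated point (10, 10) is noise: label -1
```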

Publication Date
Fri Mar 01 2013
Journal Name
Journal Of Economics And Administrative Sciences
Stability testing of time series data for CT Large industrial establishments in Iraq

Abstract:
Cointegration is one of the important concepts in applied macroeconomics. The idea of cointegration is due to Granger (1981) and was explained in detail by Engle and Granger in Econometrica (1987). The introduction of cointegration analysis into econometrics in the mid-eighties of the last century is one of the most important developments in the empirical approach to modeling; its advantage is simplicity of computation, as its use requires only familiarity with ordinary least squares.

Cointegration describes long-run equilibrium relations among time series, even if all of the sequences contain t…
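The Engle-Granger first step mentioned above, an ordinary least squares regression whose residuals are then examined for stationarity, can be sketched on simulated data. The series, sample size, and noise levels are illustrative assumptions.

```python
import random

# Engle-Granger step 1 sketch: two series share a random-walk trend, so
# they are individually non-stationary but cointegrated. Regress y on x
# by OLS; the residuals should then be stationary. Data is simulated.

random.seed(1)
n = 500
trend = 0.0
x, y = [], []
for _ in range(n):
    trend += random.gauss(0, 1)        # shared random-walk trend
    x.append(trend + random.gauss(0, 0.5))
    y.append(2 * trend + random.gauss(0, 0.5))

# OLS slope and intercept for y = a + b*x
mx, my = sum(x) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
    sum((xi - mx) ** 2 for xi in x)
a = my - b * mx
resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]

print(round(b, 2))                     # slope close to the true value 2
```

In the full procedure the residuals would then be tested for a unit root (e.g. an augmented Dickey-Fuller test), which is the step only hinted at here.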

Publication Date
Wed Jul 01 2020
Journal Name
Indonesian Journal Of Electrical Engineering And Computer Science
Fast and robust approach for data security in communication channel using pascal matrix

This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both, and shows the result of applying this method to the plain text (original message): the intelligible plain text becomes unintelligible, in order to secure information from unauthorized access and information theft. An encryption scheme usually uses a pseudo-random encryption key generated by an algorithm; here, all of this is done using the Pascal matrix. Encryption and decryption are implemented in MATLAB as the programming language, with Notepad++ used to write the input text.

Publication Date
Fri Jul 21 2023
Journal Name
Journal Of Engineering
A Modified 2D-Checksum Error Detecting Method for Data Transmission in Noisy Media

In data transmission, a change in a single bit of the received data may lead to misunderstanding or to a disaster. Each bit in the sent information has high priority, especially in fields such as the address of the receiver. Detecting an error on every single bit change is a key issue in the data transmission field.
The ordinary single-parity method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
Two novel methods are suggested for detecting binary bit-change errors when transmitting data over a noisy medium. These methods are: the 2D-Checksum me…
