Software-Defined Networking (SDN) has proven superior in addressing common network problems such as scalability, agility, and security. This advantage stems from SDN's separation of the control plane from the data plane. Although many papers and studies focus on SDN management, monitoring, control, and QoS improvement, few of them describe what they use to generate traffic and measure network performance, and the literature lacks comparisons between the tools and methods used in this context. This paper presents how to emulate an SDN environment and how to generate and collect traffic statistics from it. In addition, it compares the methods used for collecting SDN datasets, exploring the capability of each method and thereby identifying the environment each method suits. The SDN testbed was emulated using Mininet with a tree topology and OpenFlow switches, and a RYU controller was connected to handle the control plane. The well-known tools iperf3 and ping, together with Python scripts, were used to collect network datasets from several devices in the network, while Wireshark, RYU applications, and the ovs-ofctl command were used to monitor the collected datasets. The results show success in producing several types of network metrics for future use in training machine learning or deep learning algorithms. It was concluded that iperf3 is the best tool when generating data for congestion control, whereas ping is useful when generating data for DDoS attack detection. RYU applications are more suitable for querying full details of the network topology, given their ability to display the topology, switch properties, and switch statistics. Several obstacles and errors were also explored and listed so that researchers can avoid them when building such datasets in their next scientific efforts.
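To illustrate the dataset-building step described above, the following minimal sketch extracts a throughput metric from an `iperf3 --json` report (the JSON path `end.sum_received.bits_per_second` is what iperf3 emits for TCP runs; the `sample` string here is a hypothetical fragment, not real testbed output):

```python
import json

def throughput_mbps(iperf3_json: str) -> float:
    """Extract receiver-side throughput (Mbit/s) from an `iperf3 --json` report."""
    report = json.loads(iperf3_json)
    bps = report["end"]["sum_received"]["bits_per_second"]
    return bps / 1e6

# Hypothetical fragment of an iperf3 JSON report, kept to the fields used above.
sample = '{"end": {"sum_received": {"bits_per_second": 94500000.0}}}'
rate = throughput_mbps(sample)   # 94.5 Mbit/s
```

A script like this, run per host pair, yields one throughput column of the kind of dataset the paper collects.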
Malicious software (malware) performs a malicious function that compromises a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect new, previously unrecognized attack attempts and raise an early alarm to inform the system about the suspicious intrusion attempt. This paper proposes a hybrid IDS for detecting intrusions, especially malware, considering both network-packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods, chosen for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse detection…
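The hybrid decision the abstract describes (misuse detection combined with anomaly detection) can be sketched as follows; the signatures, feature vectors, and threshold below are illustrative assumptions, not the paper's actual model:

```python
# Hypothetical misuse signatures: payload substrings known to be malicious.
KNOWN_BAD_PAYLOADS = {"cmd.exe", "/etc/passwd"}

def misuse_match(payload: str) -> bool:
    """Misuse (signature) check: does the payload contain a known-bad pattern?"""
    return any(sig in payload for sig in KNOWN_BAD_PAYLOADS)

def anomaly_score(features: list, baseline: list) -> float:
    """Anomaly check: L1 distance of the feature vector from a 'normal' profile."""
    return sum(abs(f - b) for f, b in zip(features, baseline))

def classify(payload: str, features: list, baseline: list,
             threshold: float = 3.0) -> str:
    """Hybrid verdict: misuse rules first, then the anomaly threshold."""
    if misuse_match(payload):
        return "intrusion (misuse)"
    if anomaly_score(features, baseline) > threshold:
        return "intrusion (anomaly)"
    return "normal"

verdict = classify("GET /etc/passwd HTTP/1.1", [0.1, 0.2], [0.1, 0.2])
```

The ordering mirrors the usual design choice: cheap signature matching catches known malware, and the anomaly model covers previously unseen behavior.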
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key: for a higher level of secure communication, the key plays an essential role. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. Enhancing the encryption key strengthens the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to…
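The idea of reconfiguring a weak key before use can be sketched with a standard key-derivation function; this is only an illustration of key strengthening with Python's standard library, not the paper's actual combination scheme:

```python
import hashlib

def derive_3des_keys(secret: bytes, salt: bytes):
    """Derive three distinct 8-byte DES subkeys from a single (possibly weak)
    secret using PBKDF2-HMAC-SHA256, so that 3DES never runs on the raw key.
    Illustrative sketch only; the subkey split is an assumption."""
    material = hashlib.pbkdf2_hmac("sha256", secret, salt, 100_000, dklen=24)
    return material[:8], material[8:16], material[16:24]

k1, k2, k3 = derive_3des_keys(b"weak password", b"per-session salt")
```

Because the derivation is deterministic given the secret and salt, both parties holding the shared secret can reproduce the same three subkeys.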
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data to the graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into a cluster, while points falling outside cluster behavior are considered noise or anomalies. DBSCAN can thus detect abnormal points that lie beyond a set distance threshold from any cluster (extremes). However, anomalies are not only such cases, abnormal, unusual, or far from a specific group; there is also a type of data that does not recur, yet is considered abnormal relative to the known group. The analysis showed that DBSCAN using the…
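The baseline behavior the abstract builds on, DBSCAN labeling isolated points as noise, can be shown with a minimal self-contained implementation (this sketches plain DBSCAN only, not the CFG-strengthened version; `eps` and `min_pts` values are illustrative):

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one label per point; -1 marks noise,
    i.e. the points DBSCAN treats as anomalies."""
    def neighbors(i):
        px = points[i]
        return [j for j, q in enumerate(points)
                if sum((a - b) ** 2 for a, b in zip(px, q)) <= eps * eps]

    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1                 # too isolated: noise / anomaly
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster        # border point, rescued from noise
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nbrs = neighbors(j)
            if len(nbrs) >= min_pts:       # core point: expand the cluster
                queue.extend(nbrs)
    return labels

# Four tightly packed points form one cluster; the distant point is noise.
data = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10)]
labels = dbscan(data, eps=1.5, min_pts=3)   # [0, 0, 0, 0, -1]
```

The `-1` label on the far point is exactly the "far from any cluster" anomaly case; the abstract's point is that other anomaly types escape this rule.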
In data transmission, a change in a single bit of the received data may lead to a misunderstanding or a disaster. Every bit of the sent information has high priority, especially information such as the receiver's address. Detecting an error on every single bit change is therefore a key issue in the data-transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with a growing number of errors.
Two novel methods are suggested to detect binary bit-change errors when transmitting data over a noisy medium. These methods are: the 2D-Checksum method…
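The two baseline schemes the abstract contrasts, single parity and two-dimensional parity, can be sketched directly (the example block and its bits are illustrative):

```python
def parity_bit(bits):
    """Even-parity bit: 1 iff the count of 1s is odd. Any odd number of
    flipped bits changes this value; an even number cancels out undetected."""
    return sum(bits) % 2

def twod_parity(block):
    """Two-dimensional parity: one parity bit per row and per column.
    A single-bit error is then located by the intersecting failing
    row and column, which plain single parity cannot do."""
    rows = [parity_bit(r) for r in block]
    cols = [parity_bit(c) for c in zip(*block)]
    return rows, cols

block = [[1, 0, 1],
         [0, 1, 1]]
rows, cols = twod_parity(block)   # rows = [0, 0], cols = [1, 1, 0]
```

Flipping one bit of `block` changes exactly one row parity and one column parity, which is why 2D parity both detects and locates single-bit errors, yet certain 4-bit rectangular error patterns still slip through, motivating the stronger methods the paper proposes.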
Cloud-based Electronic Health Records (EHRs) have seen a substantial increase in usage in recent years, especially for remote patient monitoring. Researchers are interested in investigating the use of Healthcare 4.0 in smart cities, which involves using Internet of Things (IoT) devices and cloud computing to access medical processes remotely. Healthcare 4.0 focuses on the systematic gathering, merging, transmission, sharing, and retention of medical information at regular intervals. Protecting patients' confidential and private information presents several challenges in thwarting illegal intrusion by hackers. It is therefore essential to prioritize the protection of patient medical data that is stored, accessed, and shared on…
Most medical datasets suffer from missing data, due to the expense of some tests or to human error while recording them. This issue affects the performance of machine learning models because the values of some features will be missing, so a dedicated class of methods is needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, support vector machine (SVM), K-nearest neighbour (KNN), and Naïve Bayes…
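For readers unfamiliar with the imputation task itself, a minimal per-feature mean-imputation baseline is sketched below; this is only the naive baseline that metaheuristic imputers such as ISSA aim to beat, not the SSA-based method, and the small table is hypothetical:

```python
def impute_mean(rows, missing=None):
    """Fill each missing cell (marked by `missing`) with its column mean.
    A simple baseline for the imputation task, not the ISSA algorithm."""
    cols = list(zip(*rows))
    means = []
    for col in cols:
        known = [v for v in col if v is not missing]
        means.append(sum(known) / len(known))
    return [[means[j] if v is missing else v for j, v in enumerate(row)]
            for row in rows]

# Hypothetical 3x2 feature table with two missing cells.
data = [[1.0, None], [3.0, 4.0], [None, 8.0]]
filled = impute_mean(data)   # [[1.0, 6.0], [3.0, 4.0], [2.0, 8.0]]
```

An SSA-style imputer instead searches over candidate fill-in values, scoring each candidate by downstream classifier performance.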
Machine learning offers a significant advantage for many difficulties in the oil and gas industry, especially for resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. The workflow methodology is clarified alongside comprehensive models in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates were vague, and the methods they give are obsolete and make no concession to the real, rigorous permeability computation. To…
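A common starting point for log-based permeability prediction is a porosity-to-log-permeability regression; the closed-form least-squares sketch below is generic and the core data are hypothetical, not Bazirgan field measurements:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b, in closed form."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical core data: porosity (fraction) vs log10(permeability, mD).
porosity = [0.10, 0.15, 0.20, 0.25]
log_perm = [0.5, 1.0, 1.5, 2.0]
a, b = fit_line(porosity, log_perm)
perm_at_18pct = 10 ** (a * 0.18 + b)   # predicted permeability in mD
```

Fitting in log-permeability space is the conventional choice because permeability spans orders of magnitude; ML models in studies like this one replace the straight line with multi-log, nonlinear regressors.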
The city of Karbala is one of the most important holy places for visitors and pilgrims of the Islamic faith, especially during the Arba'in visit, when crowds of millions gather to commemorate the martyrdom of Imam Hussein. Offering services and medical treatment during this time is very important, especially as the crowds head to their destination (the holy shrine of Imam Hussein (a.s)). In recent years, the Arba'in visit has witnessed obvious growth in the number of participants, and the biggest challenges are the health risks and the preventive measures required of both organizers and visitors. Researchers have identified various challenges and factors in facilitating the Arba'in visit. The purpose of this research is to deal with the religious and…
The process of soil classification in Iraq for industrial purposes is an important topic that needs extensive and specialized study in order to advance the service and industrial reality of our dear country. Although much scientific research has touched upon soil classification in the agricultural, commercial, and other fields, no source or research can be found that addresses the classification of land for industrial purposes directly. In this research, specialized programs such as geographic information system (GIS) software have been used; a geographic information system permits studying the local distribution of phenomena and activities and the aims that can be determined in the loca…
In this study, we focus on the random-coefficient estimation of the general regression and Swamy models of panel data; this type of data gives a better chance of obtaining a better method and better indicators. Entropy methods have been used to estimate the random coefficients for the general regression and Swamy panel-data models in two ways: the first is maximum dual entropy and the second is general maximum entropy, and a comparison between them was carried out using simulation to choose the optimal method.
The results were compared using mean squared error and mean absolute percentage error across different cases in terms of correlation values…
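The two comparison criteria named above are standard and can be stated precisely in a few lines (the sample vectors are illustrative only):

```python
def mse(actual, predicted):
    """Mean squared error: average of squared residuals."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def mape(actual, predicted):
    """Mean absolute percentage error, in percent; assumes no zero actuals."""
    return 100.0 * sum(abs((a - p) / a)
                       for a, p in zip(actual, predicted)) / len(actual)

y_true = [10.0, 20.0, 40.0]
y_pred = [12.0, 18.0, 40.0]
err_mse = mse(y_true, y_pred)     # (4 + 4 + 0) / 3
err_mape = mape(y_true, y_pred)   # 100 * (0.2 + 0.1 + 0.0) / 3 = 10.0
```

MSE penalizes large residuals quadratically, while MAPE is scale-free, which is why simulation studies of estimators typically report both.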