Software-Defined Networking (SDN) has proven superior in addressing common network problems such as scalability, agility, and security. This advantage stems from SDN's separation of the control plane from the data plane. Although many papers and studies focus on SDN management, monitoring, control, and QoS improvement, few describe what they used to generate traffic and measure network performance, and the literature lacks comparisons between the tools and methods used in this context. This paper presents how to simulate an SDN environment, generate traffic in it, and obtain traffic statistics from it. In addition, it compares the methods used to collect SDN datasets in order to explore the capability of each method and thereby identify the environment suited to each one. The SDN testbed was simulated using Mininet with a tree topology and OpenFlow switches, and an RYU controller was connected to handle control traffic. The well-known tools iperf3, ping, and Python scripts are used to collect network datasets from several hosts in the network. Wireshark, RYU applications, and the ovs-ofctl command were used to monitor the collected datasets. The results show success in producing several types of network metrics for future use in training machine learning or deep learning algorithms. It is concluded that iperf3 is the best tool when generating data for congestion control, whereas ping is useful when generating data for DDoS attack detection. RYU applications are more suitable for querying full details of the network topology, given their ability to display the topology, switch features, and switch statistics. Several obstacles and errors were also explored and listed so that researchers can avoid them when building such datasets in their future scientific work.
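A minimal sketch of such a testbed is given below, assuming a depth-2/fanout-2 tree, an RYU controller listening locally on port 6633, and the default Mininet host names (h1, h4); the abstract does not give these details, so they are illustrative only, not the authors' exact script.

```python
#!/usr/bin/env python
# Sketch: build a Mininet tree topology with OpenFlow switches, attach a remote
# RYU controller, and sample latency (ping) and throughput (iperf3) between hosts.
# Tree size, controller address/port, and host names are assumptions.

from mininet.net import Mininet
from mininet.topolib import TreeTopo
from mininet.node import RemoteController, OVSSwitch
from mininet.log import setLogLevel

def run():
    topo = TreeTopo(depth=2, fanout=2)  # 4 hosts, 3 switches (assumed size)
    net = Mininet(topo=topo, switch=OVSSwitch,
                  controller=lambda name: RemoteController(name, ip='127.0.0.1',
                                                           port=6633))  # RYU (assumed)
    net.start()
    h1, h4 = net.get('h1'), net.get('h4')
    print(h1.cmd('ping -c 5 %s' % h4.IP()))            # latency samples
    h4.cmd('iperf3 -s -D')                             # iperf3 server in background
    print(h1.cmd('iperf3 -c %s -t 10 -J' % h4.IP()))   # JSON output for later parsing
    net.stop()

if __name__ == '__main__':
    setLogLevel('info')
    run()
```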
In this research, a 4×4 factorial experiment applied in a randomized complete block design was studied with a set of observations. The design of experiments is used to study the effect of treatments on experimental units and thus obtain data representing the experiment's observations. Differences in how these treatments are applied under varying environmental and experimental conditions cause noise that affects the observation values and thereby increases the mean square error of the experiment. To reduce this noise, multilevel wavelet shrinkage was used as a filter for the observations, by proposing an improved threshold that takes the different transform levels into account based on the logarithm of the b
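As an illustration of the kind of filtering described, the sketch below applies multilevel wavelet soft-thresholding with a level-dependent threshold; the wavelet family (db4), the decomposition depth, and the exact level-scaling rule are assumptions, not the study's improved threshold formula.

```python
# Sketch: multilevel wavelet denoising of experiment observations with a
# level-dependent soft threshold (assumed rule, for illustration only).

import numpy as np
import pywt

def wavelet_denoise(y, wavelet='db4', level=3):
    coeffs = pywt.wavedec(y, wavelet, level=level)   # [cA_L, cD_L, ..., cD_1]
    out = [coeffs[0]]                                # keep the approximation untouched
    for j, detail in enumerate(coeffs[1:], start=1):
        sigma = np.median(np.abs(detail)) / 0.6745   # robust noise estimate per level
        thr = sigma * np.sqrt(2.0 * np.log(len(y))) / np.log2(j + 1)  # level-scaled (assumed)
        out.append(pywt.threshold(detail, thr, mode='soft'))
    return pywt.waverec(out, wavelet)[:len(y)]

# Usage: y_hat = wavelet_denoise(observations)
```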
In this paper, an image compression technique is presented based on the zonal transform method. The DCT, Walsh, and Hadamard transform techniques are also implemented. These different transforms are applied to SAR images using different block sizes, and the effects of implementing them are investigated. The main shortcoming of this radar imaging system is the presence of speckle noise, which affects the compression results.
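A minimal sketch of blockwise zonal transform coding along these lines is shown below; the 8×8 block size, the use of the DCT, and the triangular low-frequency mask are assumptions rather than the paper's exact settings.

```python
# Sketch: zonal compression with a blockwise 2-D DCT, keeping only a
# low-frequency "zone" of coefficients in each block.

import numpy as np
from scipy.fft import dctn, idctn

def zonal_compress(img, block=8, keep=4):
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    # Zonal mask: keep coefficients whose (row + col) index sum is below `keep`
    mask = np.add.outer(np.arange(block), np.arange(block)) < keep
    for i in range(0, h - h % block, block):
        for j in range(0, w - w % block, block):
            c = dctn(img[i:i+block, j:j+block], norm='ortho')
            out[i:i+block, j:j+block] = idctn(c * mask, norm='ortho')
    return out
```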
Corpus linguistics is a methodology for studying language through corpus-based research. It differs from the traditional approach to studying a language (the prescriptive approach) in its insistence on the systematic study of authentic examples of language in use (the descriptive approach). A "corpus" is a large body of machine-readable, structurally collected, naturally occurring linguistic data, either written texts or transcriptions of recorded speech, which can be used as a starting point for linguistic description or as a means of verifying hypotheses about a language. In the past decade, interest has grown tremendously in the use of language corpora for language education. The ways in which corpora have been employed in language pedagogy
A multivariate, multisite hydrological data forecasting model was derived and checked using a case study. The philosophy is to use the cross-variable correlations, cross-site correlations, and time-lag correlations simultaneously. The case study involves two variables and three sites: the variables are monthly rainfall and evaporation, and the sites are Sulaimania, Dokan, and Darbandikhan. The model form is similar to the first-order autoregressive model, but in matrix form. A matrix for the different relative correlations mentioned above and another for their relative residuals were derived and used as the model parameters. A mathematical filter was used for both matrices to obtain their elements. The application of this model indicates i
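The matrix form described can be sketched as X_t = A X_{t-1} + B e_t, where X_t stacks the standardized monthly rainfall and evaporation values at the three sites; the short example below uses placeholder parameter matrices, since the actual correlation-derived values are not given in the abstract.

```python
# Sketch: one-step forecast with a first-order autoregressive model in matrix form,
# X_t = A @ X_{t-1} + B @ e_t, for 2 variables x 3 sites = 6 stacked series.
# A and B are placeholders, not the paper's fitted matrices.

import numpy as np

def forecast_step(x_prev, A, B, rng=np.random.default_rng()):
    """One-month-ahead value for the stacked (2 variables x 3 sites) vector."""
    e = rng.standard_normal(x_prev.shape)   # independent standardized residuals
    return A @ x_prev + B @ e

A = np.eye(6) * 0.5                         # placeholder cross-correlation matrix
B = np.eye(6) * np.sqrt(1 - 0.5 ** 2)       # placeholder residual matrix
x0 = np.zeros(6)
x1 = forecast_step(x0, A, B)
```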
Cloud storage provides scalable and low-cost resources featuring economies of scale based on a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. The most important cloud service is data storage. In order to protect the privacy of the data owner, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for data storage. Traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted data deduplication suffer from security weaknesses. This paper proposes a scheme combining compressive sensing and video deduplication to maximize
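A heavily simplified illustration of pairing compressive measurements with fingerprint-based deduplication is given below; the sensing matrix, the hashing choice, and the chunking are assumptions, and the sketch omits the encryption layer the paper addresses.

```python
# Sketch: take a compressive-sensing style measurement of each video chunk, then
# use a hash of the measurement as a deduplication fingerprint, storing only
# chunks whose fingerprint has not been seen before. Illustrative only.

import hashlib
import numpy as np

rng = np.random.default_rng(0)
M, N = 64, 256                        # measurements << chunk length (assumed sizes)
PHI = rng.standard_normal((M, N))     # shared random sensing matrix (assumption)

store = {}                            # fingerprint -> stored measurement

def upload(chunk: np.ndarray) -> str:
    y = PHI @ chunk                                             # compressive measurement
    fp = hashlib.sha256(np.round(y, 6).tobytes()).hexdigest()   # dedup fingerprint
    if fp not in store:                                         # keep only the first copy
        store[fp] = y
    return fp

# Two identical chunks deduplicate to a single stored record
c = rng.standard_normal(N)
upload(c); upload(c.copy())
assert len(store) == 1
```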
Transmitting and receiving data consume the most resources in Wireless Sensor Networks (WSNs). The energy supplied by the battery is the most important resource affecting a WSN's lifespan at the sensor node. Because sensor nodes run on limited batteries, energy saving is necessary. Data aggregation can be defined as a procedure applied to eliminate redundant transmissions; it provides fused information to the base stations, which in turn improves energy effectiveness and increases the lifespan of energy-constrained WSNs. In this paper, a Perceptually Important Points Based Data Aggregation (PIP-DA) method for Wireless Sensor Networks is suggested to reduce redundant data before sending them to the base station.
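A small sketch of one common PIP selection rule (vertical distance to the chord between neighbouring selected points) is given below; the distance measure, the number of retained points, and the function names are assumptions, not the PIP-DA method itself.

```python
# Sketch: select k Perceptually Important Points from a sensor reading series so
# that only the selected points need to be transmitted to the base station.

import numpy as np

def pip_select(y, k):
    n = len(y)
    x = np.arange(n, dtype=float)
    selected = [0, n - 1]                     # always keep the endpoints
    while len(selected) < min(k, n):
        best_i, best_d = None, -1.0
        s = sorted(selected)
        for a, b in zip(s[:-1], s[1:]):       # each gap between selected points
            for i in range(a + 1, b):
                # vertical distance from y[i] to the chord joining (a, y[a]) and (b, y[b])
                y_line = y[a] + (y[b] - y[a]) * (x[i] - x[a]) / (x[b] - x[a])
                if abs(y[i] - y_line) > best_d:
                    best_i, best_d = i, abs(y[i] - y_line)
        if best_i is None:
            break
        selected.append(best_i)
    return sorted(selected)

# Keep 5 important points out of 50 readings
readings = np.sin(np.linspace(0, 3 * np.pi, 50))
idx = pip_select(readings, 5)
```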
OpenStreetMap (OSM) is the most common example of an online volunteered mapping application. Most of these platforms rely on open-source spatial data collected by non-expert volunteers using different data collection methods. The OSM project aims to provide a free digital map of the whole world. The heterogeneity of data collection methods makes the accuracy of OSM databases unreliable, so they must be treated with caution in any engineering application. This study aims to assess the horizontal positional accuracy of three spatial data sources, the OSM road network database, a high-resolution Satellite Image (SI), and a high-resolution Aerial Photo (AP) of Baghdad city, with respect to an analogue formal road network dataset obtained
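A minimal sketch of the kind of horizontal accuracy summary such an assessment typically reports (RMSE per axis plus an NSSDA-style 95% estimate) is shown below; the function name and the assumption of matched checkpoints in a common projected CRS are illustrative, not the study's exact workflow.

```python
# Sketch: horizontal positional accuracy of a test dataset against reference
# checkpoints, summarized as RMSE in easting, northing, and the radial direction.
# Coordinates are assumed to be matched pairs in the same projected CRS (metres).

import numpy as np

def horizontal_accuracy(test_xy: np.ndarray, ref_xy: np.ndarray) -> dict:
    """test_xy, ref_xy: (n, 2) arrays of matched point coordinates."""
    dx = test_xy[:, 0] - ref_xy[:, 0]
    dy = test_xy[:, 1] - ref_xy[:, 1]
    rmse_x = np.sqrt(np.mean(dx ** 2))
    rmse_y = np.sqrt(np.mean(dy ** 2))
    rmse_r = np.sqrt(rmse_x ** 2 + rmse_y ** 2)           # horizontal RMSE
    return {"rmse_x": rmse_x, "rmse_y": rmse_y, "rmse_r": rmse_r,
            "ce95": 1.7308 * rmse_r}                       # NSSDA-style 95% estimate
```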