Software-defined networks (SDN) have a centralized control architecture that makes them a tempting target for cyber attackers. One of the major threats is the distributed denial of service (DDoS) attack, which aims to exhaust network resources and make services unavailable to legitimate users. DDoS attack detection based on machine learning is among the most widely used techniques in SDN security. In this paper, four machine learning techniques (Random Forest, K-Nearest Neighbors, Naive Bayes, and Logistic Regression) are tested for DDoS attack detection, and a mitigation technique is applied to eliminate the attack's effect on the SDN. RF and KNN were selected because of their high accuracy. Three types of network topology were generated to observe the effectiveness of the proposed algorithms on different network architectures. The results reveal that RF performs better than KNN in the single topology, while the two perform comparably in the other topologies.
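A minimal sketch of the classifier-comparison step described above, using scikit-learn; the synthetic features below merely stand in for the flow statistics the paper would extract from the SDN controller, and the hyperparameters are illustrative defaults, not the paper's settings:

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.datasets import make_classification

# Synthetic stand-in for labelled flow statistics (benign vs. DDoS).
X, y = make_classification(n_samples=1000, n_features=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("RF", RandomForestClassifier(random_state=0)),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    clf.fit(X_tr, y_tr)                       # train on the labelled flows
    acc = accuracy_score(y_te, clf.predict(X_te))
    print(f"{name} accuracy: {acc:.3f}")
```

In a deployment, the better-scoring model would drive the mitigation step, e.g. by installing drop rules for flows it flags as attack traffic.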
This paper proposes an improved neural controller structure based on an identification model for nonlinear systems. The goal of this work is to embed the structure of the Modified Elman Neural Network (MENN) model into the NARMA-L2 structure, in place of the Multi-Layer Perceptron (MLP) model, in order to construct a new hybrid neural structure that can serve as both an identifier model and a nonlinear controller for SISO linear or nonlinear systems. Two learning algorithms are used to adjust the weight parameters of the hybrid neural structure in its serial-parallel configuration: the first is a supervised learning algorithm based on the Back Propagation Algorithm (BPA), and the second is an intelligent algorithm n
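For orientation, one step of a plain Elman recurrence can be sketched as follows; this is only the baseline structure, and the paper's modified variant (MENN) and its embedding into NARMA-L2 involve additional context self-feedback weights not shown here:

```python
import numpy as np

def elman_step(x, h_prev, Wx, Wh, Wo):
    # Basic Elman recurrence: the context layer holds a copy of the previous
    # hidden state and feeds it back alongside the external input.
    h = np.tanh(Wx @ x + Wh @ h_prev)   # hidden layer with recurrent context
    y = Wo @ h                          # linear output layer
    return y, h
```

The serial-parallel identification configuration mentioned above would feed the plant's measured output, rather than the model's own prediction, back into such a recurrence during training.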
A new modified differential evolution algorithm, DE-BEA, is proposed to improve the reliability of the standard DE/current-to-rand/1/bin by implementing a new mutation scheme inspired by the bacterial evolutionary algorithm (BEA). The crossover and selection schemes of the DE method are also modified to fit the new DE-BEA mechanism. The new scheme diversifies the population by applying to all individuals a segment-based scheme that generates multiple copies (clones) of each individual one by one and applies the BEA segment-wise mechanism. These new steps are embedded in the DE/current-to-rand/1/bin scheme. The performance of the new algorithm has been compared with several DE variants over eighteen benchmark functions including sever
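For reference, the baseline mutation that DE-BEA builds on can be sketched as follows; the K and F values are common illustrative defaults rather than the paper's settings, and the BEA-inspired segment-wise cloning step is not shown:

```python
import numpy as np

def de_current_to_rand_1(pop, i, K=0.5, F=0.8, rng=None):
    """One DE/current-to-rand/1 mutant: v_i = x_i + K*(x_r1 - x_i) + F*(x_r2 - x_r3)."""
    rng = rng if rng is not None else np.random.default_rng()
    # Pick three distinct indices, all different from the current individual i.
    candidates = [j for j in range(len(pop)) if j != i]
    r1, r2, r3 = rng.choice(candidates, size=3, replace=False)
    return pop[i] + K * (pop[r1] - pop[i]) + F * (pop[r2] - pop[r3])
```

DE-BEA would replace this single mutant with the clone-and-recombine segment scheme before the (modified) crossover and selection steps.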
Objectives: Bromelain is a potent proteolytic enzyme whose unique functionality makes it valuable for various therapeutic purposes. This study aimed to develop three novel bromelain-based formulations to be used as chemomechanical caries removal agents. Methods: The novel agents were prepared using different concentrations of bromelain (10–40 wt.%), with and without 0.1–0.3 wt.% chloramine T or 0.5–1.5 wt.% chlorhexidine (CHX). Based on the enzymatic activity test, three formulations were selected: 30 % bromelain (F1), 30 % bromelain-0.1 % chloramine (F2), and 30 % bromelain-1.5 % CHX (F3). The assessments included molecular docking, Fourier-transform infrared spectroscopy (FTIR), viscosity, and pH measurements. The efficiency
In regression testing, test case prioritization (TCP) is a technique to arrange the available test cases. TCP techniques can improve fault detection performance, which is measured by the average percentage of faults detected (APFD). History-based TCP is one family of TCP techniques that considers past execution data to prioritize test cases. The issue of allocating equal priority to test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To resolve such ties in regression testing, most researchers resort to random ordering of the tied test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement
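The APFD metric mentioned above has a standard closed form: for n tests and m faults, APFD = 1 - (TF_1 + ... + TF_m)/(n*m) + 1/(2n), where TF_i is the position of the first test that reveals fault i. A small sketch, with illustrative data structures:

```python
def apfd(order, faults_detected, n_faults):
    """APFD for a prioritized test order.

    order: test ids in prioritized order (n tests)
    faults_detected: dict mapping test id -> set of fault ids it reveals
    """
    n = len(order)
    first_pos = {}
    for pos, test in enumerate(order, start=1):
        for fault in faults_detected.get(test, ()):
            first_pos.setdefault(fault, pos)   # record first detection only
    tf_sum = sum(first_pos.values())
    return 1 - tf_sum / (n * n_faults) + 1 / (2 * n)
```

For example, two tests where the first one reveals the only fault gives 1 - 1/2 + 1/4 = 0.75; higher values mean faults are found earlier in the ordering.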
Akaike's Information Criterion (AIC) is a popular method for estimating the number of sources impinging on an array of sensors, a problem of great interest in several applications. The performance of AIC degrades under low signal-to-noise ratio (SNR). This paper is concerned with the development and application of quadrature mirror filters (QMF) to improve the performance of AIC. A new system is proposed that estimates the number of sources by applying AIC to the outputs of a filter bank consisting of quadrature mirror filters. The proposed system can estimate the number of sources under low SNR.
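The eigenvalue-based AIC that such a system applies per filter-bank output can be sketched in its classical Wax-Kailath form; this is only the criterion itself, under the assumption of descending sample-covariance eigenvalues, not the paper's full QMF system:

```python
import numpy as np

def aic_num_sources(eigvals, n_snapshots):
    """Wax-Kailath AIC over descending eigenvalues of the sample covariance."""
    p = len(eigvals)
    scores = []
    for k in range(p):
        tail = eigvals[k:]                     # eigenvalues attributed to noise
        g = np.exp(np.mean(np.log(tail)))      # geometric mean of the tail
        a = np.mean(tail)                      # arithmetic mean of the tail
        scores.append(-2 * n_snapshots * (p - k) * np.log(g / a)
                      + 2 * k * (2 * p - k))   # log-likelihood term + penalty
    return int(np.argmin(scores))              # estimated number of sources
```

When the noise eigenvalues are nearly equal, the geometric and arithmetic means of the tail coincide, driving the likelihood term to zero at the true model order.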
Image compression is a serious issue in computer storage and transmission; it makes efficient use of the redundancy embedded within an image itself and may also exploit the limitations of human vision or perception to discard imperceivable information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes a lossy predictor model along with a multiresolution base and thresholding techniques; the second stage incorporates the near lossless com
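The model/residual split at the heart of this family of techniques can be illustrated with a deliberately simple left-neighbour predictor standing in for the polynomial model; the residual is what actually gets coded, and an exact inverse recovers the image:

```python
import numpy as np

def encode(img):
    # Left-neighbour predictor as a stand-in for the polynomial model.
    pred = np.zeros_like(img)
    pred[:, 1:] = img[:, :-1]        # predict each pixel from its left neighbour
    return img - pred                # residual (typically small, easy to code)

def decode(residual):
    img = residual.copy()
    for c in range(1, img.shape[1]):
        img[:, c] += img[:, c - 1]   # undo the prediction column by column
    return img
```

In a lossy variant such as the one described above, the residual would additionally be thresholded or quantized before coding, trading exact reconstruction for a smaller payload.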
A substantial matter in exchanging confidential messages over the internet is transmitting information safely. For example, consumers and producers of digital products are keen to know that those products are genuine and can be distinguished from worthless counterfeits. Data-hiding science can be defined as the technique of embedding data in image, audio, or video files in a manner that meets the safety requirements. Steganography is a branch of data-hiding science that aims to reach a desired security level in the exchange of private, concealed commercial and military data. This research offers a novel steganography technique based on hiding data inside the clusters that result from fuzzy clustering. T
Ground-based active optical sensors (GBAOS) have been successfully used in agriculture to predict crop yield potential (YP) early in the season and to improve N rates for optimal crop yield. However, the models were found to be weak or inconsistent due to environmental variation, especially rainfall. The objective of the study was to evaluate whether GBAOS could predict YP across multiple locations, soil types, cultivation systems, and rainfall differences. This study was carried out from 2011 to 2013 on corn (Zea mays L.) in North Dakota, and in 2017 on potatoes in Maine. Six N rates were used on 50 sites in North Dakota and 12 N rates on two sites, one dryland and one irrigated, in Maine. The two active GBAOS used for this study were GreenSeeker and Holl